I want to run a single-node cluster with Apache Spark; I have installed Java and Scala. I downloaded Spark prebuilt for Apache Hadoop 2.6 and extracted it. When I try to run spark-shell it throws an error, and I also cannot access sc in the shell. I compiled from source as well, but got the same error. What am I doing wrong?

Apache Spark error on startup:
Welcome to
____ __
/__/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.3.1
/_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
java.net.BindException: Failed to bind to: ADMINISTRATOR.home/192.168.1.5:0: Service 'sparkDriver' failed after 16 retries!
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
    at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Success.map(Try.scala:206)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:145)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:49)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1027)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
^
scala>
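The BindException shows that the 'sparkDriver' service could not bind to ADMINISTRATOR.home/192.168.1.5, which is why sc and sqlContext were never created and the imports above fail. A minimal sketch of a workaround one might try at the scala> prompt, assuming a single-machine setup where binding the driver to the loopback address and using a local[*] master are acceptable (the app name, 127.0.0.1, and local[*] are assumptions, not values from the log):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Assumed workaround: bind the driver explicitly to the loopback address
// instead of the hostname (ADMINISTRATOR.home) that failed to resolve/bind.
val conf = new SparkConf()
  .setAppName("spark-shell-manual")      // hypothetical app name
  .setMaster("local[*]")                 // run everything in-process
  .set("spark.driver.host", "127.0.0.1") // assumed fix for the bind failure

val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)      // recreates the missing sqlContext

An equivalent approach outside the shell is to set the SPARK_LOCAL_IP environment variable to 127.0.0.1 before launching spark-shell.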
Could you please add the exception as text? What arguments are you passing to 'spark-shell.bat'? Have you tried 'spark-shell.bat --master local[*]' or something similar? –
I tried with the --master option, but it still doesn't work. – Mateusz
You should edit the question to add more information about the problem and the exception. –