Failed to create SparkContext

I am testing Spark with Scala code in the spark-shell, building a prototype that uses Kafka with Spark Streaming.

I ran spark-shell as shown below.

 spark-shell --jars ~/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar 

And I ran this code in the shell:

 import kafka.serializer.StringDecoder
 import org.apache.spark.streaming._
 import org.apache.spark.streaming.kafka._
 import org.apache.spark.SparkConf

 // Create context with 2 second batch interval
 val sparkConf = new SparkConf().setAppName("DirectKafkaWordCount")
 val ssc = new StreamingContext(sparkConf, Seconds(2))

Then I got an error while creating ssc. spark-shell printed the message shown below.

 scala> val ssc = new StreamingContext(sparkConf, Seconds(2))
 15/06/05 09:06:08 INFO SparkContext: Running Spark version 1.3.1
 15/06/05 09:06:08 INFO SecurityManager: Changing view acls to: vagrant
 15/06/05 09:06:08 INFO SecurityManager: Changing modify acls to: vagrant
 15/06/05 09:06:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant)
 15/06/05 09:06:08 INFO Slf4jLogger: Slf4jLogger started
 15/06/05 09:06:08 INFO Remoting: Starting remoting
 15/06/05 09:06:08 INFO Utils: Successfully started service 'sparkDriver' on port 51270.
 15/06/05 09:06:08 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:51270]
 15/06/05 09:06:08 INFO SparkEnv: Registering MapOutputTracker
 15/06/05 09:06:08 INFO SparkEnv: Registering BlockManagerMaster
 15/06/05 09:06:08 INFO DiskBlockManager: Created local directory at /tmp/spark-d3349ba2-125b-4dda-83fa-abfa6c692143/blockmgr-c0e59bba-c4df-423f-b147-ac55d9bd5ccf
 15/06/05 09:06:08 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
 15/06/05 09:06:08 INFO HttpFileServer: HTTP File server directory is /tmp/spark-842c15d5-7e3f-49c8-a4d0-95bdf5c6b049/httpd-26f5e751-8406-4a97-9ed3-aa79fc46bc6e
 15/06/05 09:06:08 INFO HttpServer: Starting HTTP Server
 15/06/05 09:06:08 INFO Server: jetty-8.yz-SNAPSHOT
 15/06/05 09:06:08 INFO AbstractConnector: Started SocketConnector@0.0.0.0:55697
 15/06/05 09:06:08 INFO Utils: Successfully started service 'HTTP file server' on port 55697.
 15/06/05 09:06:08 INFO SparkEnv: Registering OutputCommitCoordinator
 15/06/05 09:06:08 INFO Server: jetty-8.yz-SNAPSHOT
 15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
 java.net.BindException: Address already in use
     at sun.nio.ch.Net.bind0(Native Method)
     at sun.nio.ch.Net.bind(Net.java:444)
     at sun.nio.ch.Net.bind(Net.java:436)
     at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
     at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
     at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
     at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
     at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
     at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
     at org.spark-project.jetty.server.Server.doStart(Server.java:293)
     at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
     at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
     at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
     at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
     at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
     at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
     ...
 15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@e067ac3: java.net.BindException: Address already in use
 java.net.BindException: Address already in use
     ...
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
 15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
 15/06/05 09:06:08 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
 15/06/05 09:06:08 INFO Server: jetty-8.yz-SNAPSHOT
 15/06/05 09:06:08 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
 15/06/05 09:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4041.
 15/06/05 09:06:08 INFO SparkUI: Started SparkUI at http://localhost:4041
 15/06/05 09:06:08 INFO SparkContext: Added JAR file:/home/vagrant/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar at http://10.0.2.15:55697/jars/spark-streaming-kafka-assembly_2.10-1.3.1.jar with timestamp 1433495168735
 15/06/05 09:06:08 INFO Executor: Starting executor ID <driver> on host localhost
 15/06/05 09:06:08 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:51270/user/HeartbeatReceiver
 15/06/05 09:06:08 INFO NettyBlockTransferService: Server created on 37393
 15/06/05 09:06:08 INFO BlockManagerMaster: Trying to register BlockManager
 15/06/05 09:06:08 INFO BlockManagerMasterActor: Registering block manager localhost:37393 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 37393)
 15/06/05 09:06:08 INFO BlockManagerMaster: Registered BlockManager
 org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true.
 The currently running SparkContext was created at:
 org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
 org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
 ...
     at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1795)
     at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:1847)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:1754)
     at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
     at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
     ...

I wonder why StreamingContext fails to create a SparkContext. How can I solve this problem?

I also checked port 4040.

This is the list of listening ports before starting spark-shell.

 vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
 tcp        0      0 0.0.0.0:22        0.0.0.0:*    LISTEN
 tcp        0      0 0.0.0.0:47078     0.0.0.0:*    LISTEN
 tcp        0      0 0.0.0.0:111       0.0.0.0:*    LISTEN
 tcp6       0      0 :::22             :::*         LISTEN
 tcp6       0      0 :::44461          :::*         LISTEN
 tcp6       0      0 :::111            :::*         LISTEN
 tcp6       0      0 :::80             :::*         LISTEN

And this is the list after starting spark-shell.

 vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
 tcp        0      0 0.0.0.0:22        0.0.0.0:*    LISTEN
 tcp        0      0 0.0.0.0:47078     0.0.0.0:*    LISTEN
 tcp        0      0 0.0.0.0:111       0.0.0.0:*    LISTEN
 tcp6       0      0 :::22             :::*         LISTEN
 tcp6       0      0 :::55233          :::*         LISTEN
 tcp6       0      0 :::4040           :::*         LISTEN
 tcp6       0      0 10.0.2.15:41545   :::*         LISTEN
 tcp6       0      0 :::44461          :::*         LISTEN
 tcp6       0      0 :::111            :::*         LISTEN
 tcp6       0      0 :::56784          :::*         LISTEN
 tcp6       0      0 :::80             :::*         LISTEN
 tcp6       0      0 :::39602          :::*         LISTEN
6 answers

By default, a SparkContext named sc is already created when the spark-shell starts. The constructor you are using tries to create a second SparkContext, which you should not do. What you should do instead is build the StreamingContext on top of the existing SparkContext, using the overloaded constructor

 new StreamingContext(sparkContext: SparkContext, batchDuration: Duration) 

So, now your code should look like this:

 // Set the master, app name and other params on the existing SparkContext's conf
 sc.getConf.setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040")
 // Use 'sc' to create a StreamingContext with a 2 second batch interval
 val ssc = new StreamingContext(sc, Seconds(2))
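
From here the original Kafka prototype can continue against this ssc. The sketch below shows one way to wire up the direct stream with the spark-streaming-kafka 1.3.1 API; the broker address localhost:9092 and the topic name test are assumptions, not values from the question:

 import kafka.serializer.StringDecoder
 import org.apache.spark.streaming.kafka.KafkaUtils

 // Assumed broker list and topic; replace with your own Kafka setup
 val kafkaParams = Map[String, String]("metadata.broker.list" -> "localhost:9092")
 val topics = Set("test")

 // Receiver-less direct stream of (key, value) message pairs
 val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
   ssc, kafkaParams, topics)

 // Count words in the message values, printing each 2-second batch
 val wordCounts = messages.map(_._2).flatMap(_.split(" ")).map((_, 1L)).reduceByKey(_ + _)
 wordCounts.print()

 ssc.start()
 ssc.awaitTermination()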

You can change the port of the Spark web UI with a property in the Spark configuration:

 spark.ui.port=44040 
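
For example, the property can be set on the SparkConf before the context is created; a minimal sketch, assuming 44040 is just an arbitrary free port:

 import org.apache.spark.SparkConf

 // 44040 is an arbitrary example; pick any port known to be free.
 // Equivalently, pass --conf spark.ui.port=44040 when launching spark-shell.
 val conf = new SparkConf().set("spark.ui.port", "44040")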

If you run the "spark shell", basically the single spark context sc works. If you need to create a new spark context for streaming, you need to use a port other than 4040 because it is allocated by the 1st spark context.

So in the end I wrote the code below to create a separate SparkContext for the streaming process.

 import kafka.serializer.StringDecoder
 import org.apache.spark.streaming._
 import org.apache.spark.streaming.kafka._
 import org.apache.spark.SparkConf

 // Create context with 2 second batch interval
 val conf = new SparkConf()
   .setMaster("local[2]")
   .setAppName("NetworkWordCount")
   .set("spark.ui.port", "44040")
   .set("spark.driver.allowMultipleContexts", "true")
 val ssc = new StreamingContext(conf, Seconds(2))
 ....

Thanks to everyone who offered a solution. ;-)


I came here looking for this answer: I was trying to connect to Cassandra through the spark-shell. Since sc is already running by default, I got this error message:

 Service 'SparkUI' could not bind on port 4040. Attempting port 4041. 

All I needed to do was:

 sc.stop 
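
After that, a fresh context can be created in the same shell without tripping the single-context check. A minimal sketch; the master and app name here are placeholders:

 import org.apache.spark.{SparkConf, SparkContext}

 // Stop the sc that spark-shell created at startup
 sc.stop()

 // Create a replacement context; "CassandraTest" is a placeholder app name
 val conf = new SparkConf().setMaster("local[2]").setAppName("CassandraTest")
 val sc2 = new SparkContext(conf)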

[I know this does not answer the question above, but this seems to be the only question on Stack Overflow that comes up when searching for this, so others may find it useful.]


Maybe not the same case, but I had a similar warning: "WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041." I restarted the machine, and then everything was fine: I started the spark-shell and saw the scala> prompt.


I encountered the same problem when starting the spark-shell. I resolved it as follows: first I went to the spark/sbin directory, and then I started Spark with this command,

  ./start-all.sh 

or you can use ./start-master.sh and ./start-slave.sh instead. Now, if you run spark-shell, pyspark, or any other Spark component, it will automatically create the sc object for you.

