Error creating StreamingContext

I open the Spark shell:

spark-shell --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.0 

Then I try to create a streaming context:

 import org.apache.spark._
 import org.apache.spark.streaming._

 val conf = new SparkConf()
   .setMaster("local[2]")
   .setAppName("NetworkWordCount")
   .set("spark.driver.allowMultipleContexts", "true")
 val ssc = new StreamingContext(conf, Seconds(1))

But I get the following exception:

 org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at: 
1 answer

When you open the Spark shell, a SparkContext has already been created for you. It is bound to the name sc, so you do not need to build a new SparkConf or a second context. Just pass the existing sc to the StreamingContext constructor:

 val ssc = new StreamingContext(sc,Seconds(1)) 

Note that we declare it with val rather than var, since the context reference should not be reassigned.
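For context, a minimal sketch of how this streaming context might then be used in the shell, building on the existing sc. The socket host and port are assumptions (e.g. a terminal running `nc -lk 9999`), not part of the original question:

```scala
// Inside spark-shell, where sc (a SparkContext) is already defined.
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Reuse the shell's SparkContext instead of creating a second one.
val ssc = new StreamingContext(sc, Seconds(1))

// Hypothetical source: read lines from a local TCP socket.
val lines = ssc.socketTextStream("localhost", 9999)

// Classic streaming word count over each 1-second batch.
val counts = lines
  .flatMap(_.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)
counts.print()

ssc.start()             // begin receiving and processing data
ssc.awaitTermination()  // block until the stream is stopped
```

This fragment only runs inside a live spark-shell session, since it depends on the pre-existing sc binding.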

