Spark: error: not found: value sc

I just started with Spark. I have CDH5 installed with Spark. However, when I try to use the SparkContext (sc), I get the error shown below:

 <console>:17: error: not found: value sc
        val distdata = sc.parallelize(data)

I researched this and found other reports of "error: not found: value sc",

and tried to start the Spark context with ./spark-shell. That gives the error "No such file or directory".

3 answers

You can start spark-shell with ./spark-shell if you are in its directory, or with path/to/spark-shell from anywhere else.

Also, if you run a script with spark-submit, you must first initialize sc as a SparkContext:

 import org.apache.spark.SparkContext
 import org.apache.spark.SparkConf

 val conf = new SparkConf().setAppName("Simple Application")
 val sc = new SparkContext(conf)
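For completeness, a minimal self-contained sketch of the above (assuming a local Spark installation is on the classpath; the app name, sample data, and the local[*] master are illustrative, not part of the original answer):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in-process for a quick test; when launched
    // with spark-submit the master is usually supplied on the command line.
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // With sc defined, the call from the question now works.
    val data = Array(1, 2, 3, 4, 5)
    val distData = sc.parallelize(data)
    println(distData.reduce(_ + _))

    sc.stop()
  }
}
```

In spark-shell itself none of this is needed: the shell creates sc for you, which is why the error only appears when the shell fails to start or when code is run outside it.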

Add the Spark directory to your PATH; then you can use spark-shell from anywhere.

Add import org.apache.spark.SparkContext if you use it in your spark-submit job to create a Spark context with:

 val sc = new SparkContext(conf) 

where conf is already defined.
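As a sketch of the PATH change, assuming a CDH5 package install that places Spark under /usr/lib/spark (a common but not universal location; parcel installs use /opt/cloudera/parcels/CDH/lib/spark — adjust to your setup):

```shell
# Append Spark's bin directory to PATH so spark-shell resolves from anywhere.
# Add this line to ~/.bashrc (or equivalent) to make it persistent.
export PATH="$PATH:/usr/lib/spark/bin"

# Now spark-shell can be launched without ./ or a full path.
spark-shell
```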


Opening a new terminal fixed the problem in my case (needed, e.g., after changing PATH).

