You can start spark-shell with ./spark-shell if you are in the directory that contains it, or with path/to/spark-shell from anywhere else.
Also, if you run a script with spark-submit, you must first initialize sc as a SparkContext yourself:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

val conf = new SparkConf().setAppName("Simple Application")
val sc = new SparkContext(conf)
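Once the script is compiled and packaged, you would launch it with spark-submit. A minimal sketch, assuming your application class is named SimpleApp and has been packaged into a jar called simple-app_2.12-1.0.jar (both names are placeholders for illustration):

```shell
# Submit the packaged application to a local Spark instance.
# --class names the object containing your main method;
# --master local[2] runs locally with 2 worker threads.
./bin/spark-submit \
  --class SimpleApp \
  --master local[2] \
  target/scala-2.12/simple-app_2.12-1.0.jar
```

The SparkConf created in the snippet above deliberately omits the master URL, so that it can be supplied here via --master instead of being hard-coded in the application.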