java.lang.ClassNotFoundException when I use "spark-submit" with a new class name (not "SimpleApp")

I wrote a Spark program in Scala, but when I use spark-submit to submit my project, I get java.lang.ClassNotFoundException.

My .sbt file:

    name := "Spark Project"

    version := "1.0"

    scalaVersion := "2.10.5"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"

My .scala file is named SparkProject.scala, and the object defined in it is also named SparkProject.

    /* SparkProject.scala */
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.SparkConf

    object SparkProject {
      def main(args: Array[String]) {
        val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }

My command to submit the project:

 spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar 

Does anyone know how to solve this? What puzzles me is that when I follow the example in the quick start guide ( http://spark.apache.org/docs/latest/quick-start.html ), it works well. But when I create a new project and submit it, it fails. Any help would be greatly appreciated.

5 answers

Adding a package name worked for me.

My code is simple too:

    package spark.wordcount

    /* SimpleApp.scala */
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.SparkConf

    object WordCount {
      def main(args: Array[String]) {
        val infile = "/input" // Should be some file on your system
        val conf = new SparkConf().setAppName("word count")
        val sc = new SparkContext(conf)
        val indata = sc.textFile(infile, 2).cache()
        val words = indata.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)
        words.saveAsTextFile("/output")
        println("All words are counted!")
      }
    }

I launched spark-submit as follows, and it ran successfully:

    [root@sparkmaster bin]# ./spark-submit --class spark.wordcount.WordCount /opt/spark-wordcount-in-scala.jar


Removing the package name works for me.


I got the same error when running Spark on Windows, building with sbt.

I did not have a "package xxx" line (for example, package spark.wordcount) in the code. If your code has no package declaration, then the command

 spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar 

should work, as @SarveshKumarSingh mentioned in the comments on the question.

But I got the error for another reason: the Scala file I had created in Notepad was actually SimpleApp.scala.txt. Once I saved it correctly as SimpleApp.scala, it worked.
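Since Windows Explorer hides known extensions by default, a mis-saved file like this is easy to miss. A small sketch of spotting and fixing it from a shell (using the file name from this answer as a stand-in):

```shell
# Stand-in for the file Notepad silently saved with a .txt extension.
touch SimpleApp.scala.txt

# A plain listing shows the real name, extension included.
ls SimpleApp*

# Fix: rename to drop the stray .txt, then rebuild with sbt.
mv SimpleApp.scala.txt SimpleApp.scala
ls SimpleApp*
```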


My two cents

I tried all the solutions mentioned, and they are all valid. In my case, I had changed the package name after running "sbt package", which is why I got this error. Recompiling with "sbt package" fixed it for me.
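The stale-jar problem this answer describes is easy to reproduce: the jar on disk keeps the old package path until you repackage. A sketch with a stand-in zip archive (all names here are hypothetical; in a real project the repackaging step is "sbt package"):

```shell
# Build a stand-in jar for the OLD package name, old.pkg.Main.
mkdir -p classes/old/pkg
touch classes/old/pkg/Main.class
(cd classes && zip -qr ../app.jar old)

# "Rename" the package in the compiled output to new.pkg ...
mkdir -p classes/new/pkg
mv classes/old/pkg/Main.class classes/new/pkg/
rm -r classes/old

# ... but the jar still contains only the old path, so submitting
# with --class new.pkg.Main would throw ClassNotFoundException.
unzip -l app.jar | grep Main.class

# Repackaging picks up the new layout and fixes the mismatch.
rm app.jar
(cd classes && zip -qr ../app.jar new)
unzip -l app.jar | grep Main.class
```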


For those still looking for a solution: I tried all the other answers, but none of them worked for me. What worked was removing the double quotes around the class name. You can try spark-submit --class SparkProject --master local[12]

