java.lang.NoClassDefFoundError: org/apache/spark/Logging

I always get the following error. Can someone help me?

 Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
     at java.lang.ClassLoader.defineClass1(Native Method)
     at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
     at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
     at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
     at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
     at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
     at java.security.AccessController.doPrivileged(Native Method)
     at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
     at com.datastax.spark.connector.japi.DStreamJavaFunctions.<init>(DStreamJavaFunctions.java:24)
     at com.datastax.spark.connector.japi.CassandraStreamingJavaUtil.javaFunctions(CassandraStreamingJavaUtil.java:55)
     at SparkStream.main(SparkStream.java:51)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:498)
     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
 Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
     ... 20 more

The error appears when I run the following code; it started after I added the saveToCassandra call. I searched the Internet but could not find a solution.

 import com.datastax.spark.connector.japi.CassandraStreamingJavaUtil;
 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaSparkContext;
 import org.apache.spark.streaming.Duration;
 import org.apache.spark.streaming.api.java.JavaDStream;
 import org.apache.spark.streaming.api.java.JavaPairInputDStream;
 import org.apache.spark.streaming.api.java.JavaStreamingContext;
 import org.apache.spark.streaming.kafka.KafkaUtils;

 import java.io.Serializable;
 import java.util.Collections;
 import java.util.HashMap;
 import java.util.Map;
 import java.util.Set;

 import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

 /**
  * Created by jonas on 10/10/16.
  */
 public class SparkStream implements Serializable {

     public static void main(String[] args) throws Exception {
         SparkConf conf = new SparkConf(true)
                 .setAppName("TwitterToCassandra")
                 .setMaster("local[*]")
                 .set("spark.cassandra.connection.host", "127.0.0.1")
                 .set("spark.cassandra.connection.port", "9042");

         JavaSparkContext sc = new JavaSparkContext(conf);
         JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(5000));

         Map<String, String> kafkaParams = new HashMap<>();
         kafkaParams.put("bootstrap.servers", "localhost:9092");
         Set<String> topics = Collections.singleton("Test");

         JavaPairInputDStream<String, String> directKafkaStream = KafkaUtils.createDirectStream(
                 ssc,
                 String.class,
                 String.class,
                 kafka.serializer.StringDecoder.class,
                 kafka.serializer.StringDecoder.class,
                 kafkaParams,
                 topics
         );

         JavaDStream<Tweet> createTweet = directKafkaStream.map(s -> createTweet(s._2));

         CassandraStreamingJavaUtil.javaFunctions(createTweet)
                 .writerBuilder("mykeyspace", "rawtweet", mapToRow(Tweet.class))
                 .saveToCassandra();

         ssc.start();
         ssc.awaitTermination();
     }

     public static Tweet createTweet(String rawKafka) {
         String[] splitted = rawKafka.split("\\|");
         Tweet t = new Tweet(splitted[0], splitted[1], splitted[2], splitted[3]);
         return t;
     }
 }

My pom is next.

 <?xml version="1.0" encoding="UTF-8"?>
 <project xmlns="http://maven.apache.org/POM/4.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>

     <groupId>com.company</groupId>
     <artifactId>Sentiment</artifactId>
     <version>1.0-SNAPSHOT</version>

     <build>
         <plugins>
             <plugin>
                 <groupId>org.apache.maven.plugins</groupId>
                 <artifactId>maven-compiler-plugin</artifactId>
                 <configuration>
                     <source>1.8</source>
                     <target>1.8</target>
                 </configuration>
             </plugin>
         </plugins>
     </build>

     <repositories>
         <repository>
             <id>twitter4j.org</id>
             <name>twitter4j.org Repository</name>
             <url>http://twitter4j.org/maven2</url>
             <releases>
                 <enabled>true</enabled>
             </releases>
             <snapshots>
                 <enabled>true</enabled>
             </snapshots>
         </repository>
     </repositories>

     <dependencies>
         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-streaming_2.11</artifactId>
             <version>2.0.1</version>
         </dependency>
         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-core_2.10</artifactId>
             <version>2.0.0</version>
         </dependency>
         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-sql_2.10</artifactId>
             <version>2.0.0</version>
         </dependency>
         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
             <version>2.0.1</version>
         </dependency>
         <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
         <dependency>
             <groupId>org.scala-lang</groupId>
             <artifactId>scala-library</artifactId>
             <version>2.11.8</version>
         </dependency>
         <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
         <dependency>
             <groupId>com.datastax.spark</groupId>
             <artifactId>spark-cassandra-connector_2.10</artifactId>
             <version>1.6.2</version>
         </dependency>
         <dependency>
             <groupId>org.apache.kafka</groupId>
             <artifactId>kafka_2.10</artifactId>
             <version>0.9.0.0</version>
         </dependency>
         <dependency>
             <groupId>org.twitter4j</groupId>
             <artifactId>twitter4j-core</artifactId>
             <version>[4.0,)</version>
         </dependency>
         <dependency>
             <groupId>org.twitter4j</groupId>
             <artifactId>twitter4j-stream</artifactId>
             <version>4.0.4</version>
         </dependency>
         <dependency>
             <groupId>org.twitter4j</groupId>
             <artifactId>twitter4j-async</artifactId>
             <version>4.0.4</version>
         </dependency>
     </dependencies>
 </project>
java maven cassandra apache-spark spark-cassandra-connector
10 answers

org.apache.spark.Logging is available in Spark 1.5.2 and earlier. It is not present in 2.0.0. Please change the versions as follows:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.11</artifactId>
     <version>1.5.2</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.10</artifactId>
     <version>1.5.2</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-sql_2.10</artifactId>
     <version>1.5.2</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
     <version>1.6.2</version>
 </dependency>

The error occurs because you are using Spark 2.0 libraries together with a connector built for Spark 1.6 (which looks for the Spark 1.6 Logging class). Use version 2.0.5 of the connector instead.
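
As a minimal sketch of what that alignment could look like (the exact 2.0.x versions here are illustrative assumptions; the point is that Spark, the connector, and the Scala suffix all match):

 <!-- Keep Spark and the Cassandra connector on the same Spark major version and Scala version -->
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.11</artifactId>
     <version>2.0.1</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.11</artifactId>
     <version>2.0.1</version>
 </dependency>
 <dependency>
     <groupId>com.datastax.spark</groupId>
     <artifactId>spark-cassandra-connector_2.11</artifactId>
     <version>2.0.5</version>
 </dependency>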


I got a solution by changing the jar mentioned above.

Initially, I had an outdated jar for spark-streaming-kafka:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
     <version>2.1.1</version>
 </dependency>

I also removed the duplicate slf4j-log4j and log4j jars that I had added externally, alongside the Spark and Kafka jars.


One possible cause of this problem is a library/class conflict on the classpath. I ran into this problem and solved it with some Maven exclusions:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.11</artifactId>
     <version>2.0.0</version>
     <scope>provided</scope>
     <exclusions>
         <exclusion>
             <groupId>log4j</groupId>
             <artifactId>log4j</artifactId>
         </exclusion>
     </exclusions>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.11</artifactId>
     <version>2.0.0</version>
     <scope>provided</scope>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
     <version>2.0.0</version>
     <exclusions>
         <exclusion>
             <groupId>org.slf4j</groupId>
             <artifactId>slf4j-log4j12</artifactId>
         </exclusion>
         <exclusion>
             <groupId>log4j</groupId>
             <artifactId>log4j</artifactId>
         </exclusion>
     </exclusions>
 </dependency>

This pom.xml solved my problem:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.10</artifactId>
     <version>1.6.1</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.10</artifactId>
     <version>1.6.1</version>
 </dependency>

I downloaded the jar and passed it with --jars to spark-submit; this command worked for me:

 spark-submit --class com.SentimentTwiteer --packages "org.apache.spark:spark-streaming-twitter_2.11:1.6.3" --jars /root/Desktop/spark-core_2.11-1.5.2.logging.jar /root/Desktop/SentimentTwiteer.jar XX XX XX XX


Download spark-core_2.11-1.5.2.logging.jar and pass it with the --jars option:

 spark-submit --class com.SentimentTwiteer --packages "org.apache.spark:spark-streaming-twitter_2.11:1.6.3" --jars /root/Desktop/spark-core_2.11-1.5.2.logging.jar /root/Desktop/SentimentTwiteer.jar consumerKey consumerSecret accessToken accessTokenSecret yoursearchTag

https://github.com/sinhavicky4/SentimentTwiteer


As everyone says, this is due to the org.apache.spark.Logging class being missing after 1.5.2 (only org.apache.spark.internal.Logging exists in later versions).

But it seemed that none of the Maven-based solutions resolved this dependency, so I just added the class to lib manually; that is how I fixed the problem.


The Logging jar is missing from your list of dependencies. Try downloading the "spark-core_2.11-1.5.2.logging" jar from the Maven repository and adding it as an external jar to your Spark project; then you will no longer get the "java.lang.NoClassDefFoundError: org/apache/spark/Logging" error. Download the jar matching your Scala version (2.10, 2.11, etc.).
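
If you prefer to wire the downloaded jar in through Maven rather than adding it in the IDE, one option is a system-scoped dependency. This is only a sketch: the groupId/artifactId/version are arbitrary placeholder labels, and the systemPath is a hypothetical location you would adjust to wherever you saved the jar.

 <!-- Placeholder coordinates for the locally downloaded logging jar; adjust systemPath to your copy -->
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core-logging</artifactId>
     <version>1.5.2</version>
     <scope>system</scope>
     <systemPath>${project.basedir}/lib/spark-core_2.11-1.5.2.logging.jar</systemPath>
 </dependency>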


It's a version problem; try with the latest versions:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-sql_2.11</artifactId>
     <version>2.1.0</version>
     <scope>provided</scope>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
     <version>2.1.1</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.11</artifactId>
     <version>2.1.0</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.11</artifactId>
     <version>2.1.0</version>
 </dependency>
