Spark 1.5.1, Cassandra Connector 1.5.0-M2, Cassandra 2.1, Scala 2.10, NoSuchMethodError guava dependency

I'm new to Spark (and fairly new to Maven), so I'm struggling with how to properly package and ship the dependencies I need.

It seems that Spark 1.5.1 has a dependency on guava-14.0.1 and picks that up at runtime, but TypeToken.isPrimitive() was only added in Guava 15+. What is the right way to make sure the Guava in my uber-jar wins? I tried spark.executor.extraClassPath in my spark-defaults.conf to no avail.
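
One knob I'm aware of but have not verified myself is Spark's experimental "user classpath first" setting, which is supposed to prefer classes from the application jar over Spark's own. In spark-defaults.conf that would look like this (I don't know whether it plays nicely with a shaded jar):

  spark.driver.userClassPathFirst true
  spark.executor.userClassPathFirst true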

This is essentially a duplicate of this [question]: Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = Java.lang.NoSuchMethodError, but for Maven (I'm not able to comment there yet).

I declare my dependencies like this:

  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-compress</artifactId>
    <version>1.10</version>
  </dependency>
  <dependency>
    <groupId>com.esotericsoftware.kryo</groupId>
    <artifactId>kryo</artifactId>
    <version>2.21</version>
  </dependency>
  <dependency>
    <groupId>org.objenesis</groupId>
    <artifactId>objenesis</artifactId>
    <version>2.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.0</version>
    <exclusions>
      <exclusion>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
      <exclusion>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.5.0</version>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.5.0-M2</version>
  </dependency>
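
As far as I know, the Maven dependency plugin can show which Guava version actually gets resolved (in case something else is pulling in an older one):

  mvn dependency:tree -Dincludes=com.google.guava:guava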

I build my JAR with all of its dependencies using the maven-shade-plugin:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.3</version>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <artifactSet>
            <excludes>
              <exclude>org.apache.hadoop:*</exclude>
              <exclude>org.apache.hbase:*</exclude>
            </excludes>
          </artifactSet>
          <filters>
            <filter>
              <artifact>*:*</artifact>
              <excludes>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.RSA</exclude>
              </excludes>
            </filter>
            <filter>
              <artifact>org.apache.spark:spark-network-common_2.10</artifact>
              <excludes>
                <exclude>com.google.common.base.*</exclude>
              </excludes>
            </filter>
          </filters>
          <transformers>
            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
              <!-- merge multiple reference.conf files into one -->
              <resource>reference.conf</resource>
            </transformer>
          </transformers>
        </configuration>
      </execution>
    </executions>
  </plugin>
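
For what it's worth, another approach I have seen suggested for this kind of Guava clash (not what I'm doing above, and I haven't tried it yet) is to relocate Guava's packages inside the shaded jar, so that my copy of Guava 18 cannot collide with the Guava 14 that Spark puts on the classpath. Roughly, that would be an extra section inside the shade plugin's <configuration> (the "shaded.guava" prefix is just an arbitrary name of my choosing):

  <relocations>
    <relocation>
      <!-- rename Guava classes bundled in my jar so they never clash with Spark's Guava -->
      <pattern>com.google.common</pattern>
      <shadedPattern>shaded.guava.com.google.common</shadedPattern>
    </relocation>
  </relocations>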

Here is the stack trace I get when I run:

./spark-submit --master local --class <my main class> <my shaded jar>

 Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
     at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
     at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
     at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
     at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
     at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
     at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
     at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
     at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
     at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
     at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
     at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
     at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
java scala maven apache-spark spark-cassandra-connector
1 answer

I ended up fixing the dependency problem by brute force, explicitly including the Guava jar I need via conf/spark-defaults.conf:

 spark.driver.extraClassPath /home/osboxes/Packages/guava-18.0.jar
 spark.executor.extraClassPath /home/osboxes/Packages/guava-18.0.jar
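
As I understand it, these extraClassPath entries are prepended to the driver and executor classpaths, which is why this copy of Guava wins over the one bundled with Spark. If you prefer not to edit spark-defaults.conf, I believe the same thing can be passed on the spark-submit command line (reusing the jar path from above):

 ./spark-submit --master local \
   --driver-class-path /home/osboxes/Packages/guava-18.0.jar \
   --conf spark.executor.extraClassPath=/home/osboxes/Packages/guava-18.0.jar \
   --class <my main class> <my shaded jar>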
