Datastax Cassandra Driver Throws CodecNotFoundException

The exact exception is as follows:

com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [varchar <-> java.math.BigDecimal]

These are the software versions I am using: Spark 1.5, DataStax Cassandra 3.2.1, CDH 5.5.1.

The code I am trying to execute is a Spark program using the Java API. It reads data (CSV) from HDFS and loads it into Cassandra tables, using the spark-cassandra-connector. At first I had a lot of problems caused by conflicting versions of the Guava library, which I resolved by shading Guava and building a fat jar with all dependencies.

However, while I could load the data for some files, for others I get a codec exception. When I researched this issue, I found the following thread on the same problem:

https://groups.google.com/a/lists.datastax.com/forum/#!topic/java-driver-user/yZyaOQ-wazk

After going through these discussions, I understand that either I am using the wrong version of the Cassandra driver, or there is still a classpath problem with the Guava library: Cassandra 3.0 and later use Guava 16.0.1, and the discussion above says an older version of Guava may be present on the classpath.

Here is the pom.xml file:

 <dependencies>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.10</artifactId>
     <version>1.5.0</version>
   </dependency>
   <dependency>
     <groupId>junit</groupId>
     <artifactId>junit</artifactId>
     <version>3.8.1</version>
     <scope>test</scope>
   </dependency>
   <dependency>
     <groupId>com.datastax.spark</groupId>
     <artifactId>spark-cassandra-connector-java_2.10</artifactId>
     <version>1.5.0-M3</version>
   </dependency>
   <dependency>
     <groupId>org.apache.cassandra</groupId>
     <artifactId>cassandra-clientutil</artifactId>
     <version>3.2.1</version>
   </dependency>
 </dependencies>

 <build>
   <plugins>
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-shade-plugin</artifactId>
       <version>2.3</version>
       <executions>
         <execution>
           <phase>package</phase>
           <goals>
             <goal>shade</goal>
           </goals>
           <configuration>
             <filters>
               <filter>
                 <artifact>*:*</artifact>
                 <excludes>
                   <exclude>META-INF/*.SF</exclude>
                   <exclude>META-INF/*.DSA</exclude>
                   <exclude>META-INF/*.RSA</exclude>
                 </excludes>
               </filter>
             </filters>
             <relocations>
               <relocation>
                 <pattern>com.google</pattern>
                 <shadedPattern>com.pointcross.shaded.google</shadedPattern>
               </relocation>
             </relocations>
             <minimizeJar>false</minimizeJar>
             <shadedArtifactAttached>true</shadedArtifactAttached>
           </configuration>
         </execution>
       </executions>
     </plugin>
   </plugins>
 </build>
 </project>

and these are the dependencies that were pulled in by the above pom:

 spark-core_2.10-1.5.0.jar
 spark-cassandra-connector-java_2.10-1.5.0-M3.jar
 spark-cassandra-connector_2.10-1.5.0-M3.jar
 spark-repl_2.10-1.5.1.jar
 spark-bagel_2.10-1.5.1.jar
 spark-mllib_2.10-1.5.1.jar
 spark-streaming_2.10-1.5.1.jar
 spark-graphx_2.10-1.5.1.jar
 guava-16.0.1.jar
 cassandra-clientutil-3.2.1.jar
 cassandra-driver-core-3.0.0-alpha4.jar

Above are some of the main dependencies in my shaded jar.

Why am I getting the CodecNotFoundException? Is it because of the classpath (Guava), because of the Cassandra driver (cassandra-driver-core-3.0.0-alpha4.jar against DataStax Cassandra 3.2.1), or because of my code?

Another point: all the dates I insert go into columns whose data type is timestamp.

Also, when I do spark-submit, I see the classpath in the logs, and there are other versions of Guava under the lib directories. Could those be causing the problem?

We also specify an explicit jar path when we execute spark-submit. Will that help?

I would be glad to get some comments on these questions. Thanks.

Below is the stack trace:

 com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [timestamp <-> java.lang.String]
     at com.datastax.driver.core.CodecRegistry.notFound(CodecRegistry.java:689)
     at com.datastax.driver.core.CodecRegistry.createCodec(CodecRegistry.java:550)
     at com.datastax.driver.core.CodecRegistry.findCodec(CodecRegistry.java:530)
     at com.datastax.driver.core.CodecRegistry.codecFor(CodecRegistry.java:485)
     at com.datastax.driver.core.AbstractGettableByIndexData.codecFor(AbstractGettableByIndexData.java:85)
     at com.datastax.driver.core.BoundStatement.bind(BoundStatement.java:198)
     at com.datastax.driver.core.DefaultPreparedStatement.bind(DefaultPreparedStatement.java:126)
     at com.cassandra.test.LoadDataToCassandra$1.call(LoadDataToCassandra.java:223)
     at com.cassandra.test.LoadDataToCassandra$1.call(LoadDataToCassandra.java:1)
     at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1027)
     at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
     at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
     at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
     at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
     at org.apache.spark.scheduler.Task.run(Task.scala:88)
     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
     at java.lang.Thread.run(Thread.java:745)

I also got

 com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [java.math.BigDecimal <-> java.lang.String]
guava datastax-java-driver cloudera-cdh datastax spark-cassandra-connector
2 answers

When you call bind(params...) on a PreparedStatement, the driver expects you to provide values whose Java types map to the CQL types of the bound columns.

This error ( [timestamp <-> java.lang.String] ) tells you that no codec is registered that maps a Java String to a CQL timestamp. In the Java driver, the timestamp type maps to java.util.Date. So you have two options:

  • If the bound column is a timestamp, provide a Date-typed value instead of a String.
  • Create a codec that maps timestamp <-> String. To do this, you can subclass MappingCodec, as described on the documentation site, so that it maps String to timestamp:
 public class TimestampAsStringCodec extends MappingCodec<String, Date> {
     public TimestampAsStringCodec() {
         super(TypeCodec.timestamp(), String.class);
     }

     @Override
     protected Date serialize(String value) { ... }

     @Override
     protected String deserialize(Date value) { ... }
 }

Then you will need to register the codec:

 cluster.getConfiguration().getCodecRegistry()
     .register(new TimestampAsStringCodec());
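For the first option above (binding a Date instead of a String), the CSV value can be parsed up front on the Spark side. A minimal sketch; the class name, the "yyyy-MM-dd HH:mm:ss" pattern, and the UTC timezone are assumptions you would adjust to the actual CSV format:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class CsvDates {
    // Hypothetical helper: parses a CSV date string into java.util.Date,
    // the type the driver's built-in codec expects for CQL timestamp columns.
    // The pattern below is an assumption; change it to match your files.
    private static final String PATTERN = "yyyy-MM-dd HH:mm:ss";

    public static Date parse(String raw) throws ParseException {
        SimpleDateFormat fmt = new SimpleDateFormat(PATTERN);
        // Pin the timezone so results do not depend on each executor's locale.
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.parse(raw.trim());
    }
}
```

With something like this in place, the bound statement receives a Date directly, e.g. preparedStatement.bind(key, CsvDates.parse(csvValue)), and no custom codec is needed.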
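The same reasoning applies to the other error in the question (the BigDecimal <-> java.lang.String variant): for CQL decimal columns the driver's built-in codec expects java.math.BigDecimal, not String. A minimal sketch, with a hypothetical helper name:

```java
import java.math.BigDecimal;

public class CsvDecimals {
    // Hypothetical helper: converts a CSV numeric string into the
    // java.math.BigDecimal the driver's decimal codec expects.
    public static BigDecimal parse(String raw) {
        return new BigDecimal(raw.trim());
    }
}
```

As with the timestamp case, converting before bind() avoids the need for a custom codec.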

A better solution is described in the driver documentation.

The correct mappings that the driver offers for temporal types are as follows:

  • DATE <-> com.datastax.driver.core.LocalDate: use getDate()
