I also followed the example in the first document that you linked. You will notice that in the Prerequisites section, step 2 requires you to create the example as a Maven project. Step 3 lists the four dependencies that you need to add to the project. Two of these dependencies are specific to the Spark Connector:
- com.datastax.spark:spark-cassandra-connector_2.10:1.0.0-RC4
- com.datastax.spark:spark-cassandra-connector-java_2.10:1.0.0-RC4
Basically, the "dependencies" section of the pom.xml for my Spark projects is as follows:
<dependencies>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.1.0-alpha2</version>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.1.0-alpha2</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>
Double-check that your pom.xml has these dependencies, then call Maven to resolve and download the Spark Connector libraries into your local repository. This worked for me:
cd workspace/sparkTest2
mvn package
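Once the build succeeds, you can verify the dependencies actually work with a small Java program that reads a table through the connector's Java API. This is only a sketch: the keyspace and table names (my_keyspace, my_table) and the Cassandra host are placeholders you would replace with your own, and it needs a Cassandra node running to execute.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Java API entry point from spark-cassandra-connector-java_2.10
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

public class SparkCassandraTest {
    public static void main(String[] args) {
        // Point the connector at your Cassandra node
        // (127.0.0.1 here is a placeholder for your cluster address)
        SparkConf conf = new SparkConf()
                .setAppName("spark-cassandra-test")
                .setMaster("local[2]")
                .set("spark.cassandra.connection.host", "127.0.0.1");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read a table as an RDD of CassandraRow and count its rows.
        // "my_keyspace" and "my_table" are hypothetical names.
        long rows = javaFunctions(sc)
                .cassandraTable("my_keyspace", "my_table")
                .count();

        System.out.println("Row count: " + rows);
        sc.stop();
    }
}
```

If this runs and prints a row count, your pom.xml and the connector jars are wired up correctly.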
Aaron