Project setup:
- 1 producer - serializes objects and sends bytes to Kafka
- 1 Spark consumer - must use the DefaultDecoder from the kafka.serializer package to consume the raw bytes
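For context, a minimal sketch of the consumer side with the spark-streaming-kafka 1.6 receiver API, showing where DefaultDecoder is needed (topic name, group id, and object names are placeholders, not the actual project code):

```scala
import kafka.serializer.DefaultDecoder
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ByteConsumer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ByteConsumer").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map(
      "zookeeper.connect" -> "localhost:2181",
      "group.id"          -> "rss-reader-demo")

    // DefaultDecoder passes each Array[Byte] through untouched,
    // so the application can deserialize the objects itself.
    val stream = KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
      ssc, kafkaParams, Map("rss-topic" -> 1), StorageLevel.MEMORY_AND_DISK_SER)

    stream.map { case (_, bytes) => bytes.length }.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```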
Question:
- SBT pulls in the correct libraries (kafka-clients + kafka_2.10), but the compiler cannot find any classes from the kafka_2.10 artifact.
- It seems the compiler is resolving the import against the wrong path (org.apache.spark.streaming.kafka instead of the top-level kafka package).
Error message:

[error] object serializer is not a member of package org.apache.spark.streaming.kafka
[error] import kafka.serializer.DefaultDecoder
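The symptom (the compiler prefixing the import with org.apache.spark.streaming) is consistent with Scala 2's relative imports: after a wildcard import of org.apache.spark.streaming._, the name kafka resolves to Spark's kafka subpackage and shadows the top-level kafka package. A self-contained sketch using stub packages (not the real libraries) that reproduces the clash and the usual _root_ workaround:

```scala
// Stub packages mimicking the clash between Spark's streaming.kafka
// subpackage and the top-level kafka package from kafka_2.10.
package org.apache.spark.streaming.kafka { object KafkaUtilsStub }
package kafka.serializer                { object DefaultDecoderStub }

object ImportDemo {
  import org.apache.spark.streaming._ // brings the name `kafka` into scope

  // import kafka.serializer.DefaultDecoderStub
  //   would NOT compile here: `kafka` now means org.apache.spark.streaming.kafka

  // Prefixing with _root_ forces resolution from the top-level package.
  import _root_.kafka.serializer.DefaultDecoderStub

  def main(args: Array[String]): Unit =
    println(DefaultDecoderStub.getClass.getName)
}
```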
SBT dependency tree:

[info] +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
[info] | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S]  <-- **DefaultDecoder (kafka.serializer.DefaultDecoder) is in here, but the compiler can't find it**
[info] | | +-org.apache.kafka:kafka-clients:0.8.2.1
build.sbt:
lazy val commonSettings = Seq(
  organization := "org.RssReaderDemo",
  version      := "0.1.0",
  scalaVersion := "2.10.6"
)

resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"

val spark            = "org.apache.spark" % "spark-core_2.10"            % "1.6.1"
val sparkStreaming   = "org.apache.spark" % "spark-streaming_2.10"       % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"

// Needed to be able to parse the generated Avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"

val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val avro      = "org.apache.avro" % "avro" % "1.8.0"

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    libraryDependencies += spark,
    libraryDependencies += sparkStreaming,
    libraryDependencies += sparkStreamKafka,
    libraryDependencies += jacksonMapperAsl,
    libraryDependencies += scalactic,
    libraryDependencies += scalatest,
    libraryDependencies += avro
  )
sbt apache-spark apache-kafka kafka-consumer-api spark-streaming
mds91