How to use from_json with Kafka connect 0.10 and Spark Structured Streaming?

I tried to reproduce the example from the Databricks blog post [1] and apply it to the new Kafka connector and Structured Streaming, but I cannot parse the JSON correctly using the built-in methods in Spark.

Note: the messages are written to the Kafka topic in JSON format.

    val ds1 = spark
      .readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", IP + ":9092")
      .option("zookeeper.connect", IP + ":2181")
      .option("subscribe", TOPIC)
      .option("startingOffsets", "earliest")
      .option("max.poll.records", 10)
      .option("failOnDataLoss", false)
      .load()

The following code does not work; I believe it is because the json column is a plain string, which does not match the signature of from_json (it also requires a schema):

    val df = ds1
      .select($"value" cast "string" as "json")
      .select(from_json("json") as "data")
      .select("data.*")

Any tips?

[UPDATE] Work example: https://github.com/katsou55/kafka-spark-structured-streaming-example/blob/master/src/main/scala-2.11/Main.scala

1 answer

First you need to define a schema for your JSON message, for example:

    val schema = new StructType()
      .add($"id".string)
      .add($"name".string)
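If you prefer not to rely on the implicit `$` column syntax, the same schema can be written with explicit field types. A minimal sketch, assuming the standard `org.apache.spark.sql.types` API (both fields are assumed to be strings here, matching the example above):

```scala
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Equivalent schema built from explicit StructFields.
val schema = StructType(Seq(
  StructField("id", StringType),
  StructField("name", StringType)
))
```

This form is often easier to generate programmatically, e.g. when the field list comes from configuration.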

Now you can use this schema in the from_json method, as shown below:

    val df = ds1
      .select($"value" cast "string" as "json")
      .select(from_json($"json", schema) as "data")
      .select("data.*")
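Putting the pieces together, a minimal end-to-end sketch might look like the following. The broker address, topic name, and console sink are placeholder assumptions for illustration; substitute your own values (the question uses `IP` and `TOPIC`):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{StringType, StructType}

val spark = SparkSession.builder()
  .appName("kafka-json-example")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Schema describing the JSON payload of each Kafka message.
val schema = new StructType()
  .add("id", StringType)
  .add("name", StringType)

val ds1 = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker address
  .option("subscribe", "my-topic")                     // assumed topic name
  .option("startingOffsets", "earliest")
  .load()

// Kafka delivers the payload as binary; cast to string, then parse with the schema.
val df = ds1
  .select($"value".cast("string").as("json"))
  .select(from_json($"json", schema).as("data"))
  .select("data.*")

// Console sink for illustration only; use a real sink in production.
val query = df.writeStream
  .format("console")
  .start()

query.awaitTermination()
```

Note that options such as `zookeeper.connect` and `max.poll.records` from the question are not needed by the Spark Kafka source; the source manages offsets itself.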
