How to insert data into a relational database (MySQL) using Spark?

I am trying to insert data into a MySQL table through Spark SQL.

Here is my table:

CREATE TABLE images (
  id    INT          NOT NULL AUTO_INCREMENT,
  name  VARCHAR(100) NOT NULL,
  data  LONGBLOB     NOT NULL
);

and here is my Spark code:

case class Image(name: String, data: Array[Byte])

def saveImage(image: Image): Unit = {
  sqlContext.sql(s"""INSERT INTO images (name, data) VALUES ('${image.name}', ${image.data});""".stripMargin)
}

But I get an error message:

java.lang.RuntimeException: [1.13] failure: ``table'' expected but identifier images found
INSERT INTO images (name, data)
            ^

What is wrong with my code?

1 answer

Finally, I found a solution. The parser error occurs because Spark SQL expects the HiveQL form `INSERT INTO TABLE <name> ...` (note the caret pointing at `images`, where the keyword `TABLE` was expected), and it does not support `INSERT ... VALUES` statements against a JDBC source. The workaround is to build a DataFrame holding the row and save it through JDBC. Here is an example:

def saveImage(image: Image): Unit = {
  val df = sqlContext.createDataFrame(
    sc.parallelize(
      Image(
        name = image.name,
        data = image.data
      ) :: Nil
    )
  )

  // url and props hold the JDBC connection URL and connection properties
  JdbcUtils.saveTable(df, url, "images", props)
}

And the model will look like this:

case class Image(
  id   : Option[Int] = None,
  name : String,
  data : Array[Byte]
)