I am reading a Hive table using Spark SQL and assigning it to a Scala val:
val x = sqlContext.sql("select * from some_table")
Then I do some processing on the DataFrame x and finally come up with a DataFrame y that has exactly the same schema as some_table.
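The actual transformations are not relevant to the problem; schematically it is something like this (the column name is just a placeholder, not my real logic):

// placeholder for the real processing; y keeps the same schema as some_table
val y = x.filter(x("some_column").isNotNull)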
Finally, I am trying to insert overwrite the DataFrame y back into the same Hive table some_table:
y.write.mode(SaveMode.Overwrite).insertInto("some_table")
Then I get an error:
org.apache.spark.sql.AnalysisException: Cannot insert overwrite into table that is also being read from.
I tried creating an INSERT SQL statement and running it with sqlContext.sql(), but it also gave me the same error.
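Roughly what that attempt looked like (the temporary table name is illustrative); it fails with the same AnalysisException:

y.registerTempTable("y_temp")  // illustrative name for a temporary table over y
sqlContext.sql("INSERT OVERWRITE TABLE some_table SELECT * FROM y_temp")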
Is there a way around this error? I need to insert records back into the same table.
Hi, I tried to do as suggested, but I am still getting the same error:
val x = sqlContext.sql("select * from incremental.test2")
val y = x.limit(5)
y.registerTempTable("temp_table")
val dy = sqlContext.table("temp_table")
dy.write.mode("overwrite").insertInto("incremental.test2")

scala> dy.write.mode("overwrite").insertInto("incremental.test2")
org.apache.spark.sql.AnalysisException: Cannot insert overwrite into table that is also being read from.;
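Registering a temporary table does not help here, because temp_table is only a view over the same logical plan, so the insert still reads from incremental.test2. One workaround that is sometimes suggested is to materialize the intermediate result into a separate staging table (or a file path) first, so that the final insert no longer reads from the table it overwrites. A minimal sketch, with an illustrative staging table name:

// materialize the result in a separate staging table to break the dependency
// on incremental.test2 (the staging table name is illustrative)
dy.write.mode("overwrite").saveAsTable("incremental.test2_staging")

// re-read the staged data and insert it back into the original table;
// this plan no longer reads from incremental.test2
val staged = sqlContext.table("incremental.test2_staging")
staged.write.mode("overwrite").insertInto("incremental.test2")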