In Apache Spark SQL, how to close the metastore connection from a HiveContext

My project has unit tests against different HiveContext configurations (sometimes they are in the same file, grouped by feature).

After upgrading to Spark 1.4, I ran into a lot of "java.sql.SQLException: Another instance of Derby may have already booted the database" errors, because a patch makes these contexts unable to share the same metastore. Since it is not clean to revert the state of a singleton for every test, my only option boils down to "recycling" each context by terminating the previous Derby metastore connection. Is there a way to do this?
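For context, a minimal sketch of the kind of setup that triggers the conflict, assuming an embedded Derby metastore and Spark 1.4-era APIs (the names here are illustrative, not from the original post):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Two HiveContexts in the same JVM, each trying to boot the same
// embedded Derby metastore_db; the second one fails with the
// SQLException above.
val sc = new SparkContext(
  new SparkConf().setMaster("local[2]").setAppName("metastore-demo"))
val hiveA = new HiveContext(sc)
val hiveB = new HiveContext(sc) // Derby allows only one booting instance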

+6
1 answer

Well, in Scala I just use FunSuite for unit tests along with the BeforeAndAfterAll trait. Then you can simply initialize your sparkContext in beforeAll, derive the HiveContext from it, and shut everything down like this:

override def afterAll(): Unit = {
  if (sparkContext != null) {
    sparkContext.stop()
  }
}

From what I have noticed, this also closes the HiveContext attached to it.
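Putting it together, a minimal self-contained sketch of such a suite (the suite name and the test body are illustrative additions, assuming ScalaTest's FunSuite and Spark 1.4-era APIs):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class HiveContextSuite extends FunSuite with BeforeAndAfterAll {

  private var sparkContext: SparkContext = _
  private var hiveContext: HiveContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("HiveContextSuite")
    sparkContext = new SparkContext(conf)
    hiveContext = new HiveContext(sparkContext)
  }

  override def afterAll(): Unit = {
    // Stopping the SparkContext also releases the HiveContext
    // (and its Derby metastore connection) derived from it.
    if (sparkContext != null) {
      sparkContext.stop()
    }
  }

  test("runs a simple query against the HiveContext") {
    assert(hiveContext.sql("SELECT 1").collect().head.getInt(0) === 1)
  }
}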

+1
