show tables like '*' does not work in Spark SQL 1.3.0+

We have an instance of Spark 1.2.0 on which we can run the command show tables like 'tmp*'; through beeline connected to the thrift server port with no problem. We are testing Spark 1.4.0 on the same machine, but when we run the same command on Spark 1.4.0, we get the following error:

 0: jdbc:hive2://localhost:10001> show tables like 'tmp*';
 Error: java.lang.RuntimeException: [1.13] failure: ``in'' expected but identifier like found
 show tables like 'tmp*'
             ^ (state=,code=0)
 0: jdbc:hive2://localhost:10001>

I also tried Spark 1.3.0 on this machine, and it gives the same error as above when running show tables like 'tmp*'.

Does anyone know whether there is an equivalent command in Spark SQL 1.3.0+ that lets you use wildcards to return tables matching a given pattern?

This was done on a machine with CDH 5.3.0 and Hive 0.13.1-cdh5.3.0, if that matters.

1 answer

You can use the command below in the Spark SQL shell:

 sqlContext.tables().filter("tableName LIKE '%tmp%'").collect() 
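For a prefix match like the original Hive query, a sketch along these lines should work in the spark-shell (Spark 1.3/1.4). It assumes sqlContext is the shell's pre-created context and uses the SQL LIKE wildcard % rather than Hive's *; the value names are only illustrative:

 // Minimal sketch, assuming the spark-shell's pre-created sqlContext (Spark 1.3/1.4).
 // tables() returns a DataFrame with columns tableName and isTemporary.
 val allTables = sqlContext.tables()
 
 // SQL LIKE uses % as the wildcard, so 'tmp%' matches what Hive's 'tmp*' did.
 val tmpTables = allTables.filter("tableName LIKE 'tmp%'")
 
 // Print just the matching table names.
 tmpTables.select("tableName").collect().foreach(row => println(row.getString(0)))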
