Filter a Spark DataFrame by checking whether a column value is in a list, combined with other criteria

As a simplified example, I tried to filter a Spark DataFrame with the following code:

    val xdf = sqlContext.createDataFrame(Seq(("A", 1), ("B", 2), ("C", 3))).toDF("name", "cnt")
    xdf.filter($"cnt" > 1 || $"name" isin ("A", "B")).show()

This produces the following error:

 org.apache.spark.sql.AnalysisException: cannot resolve '((cnt > 1) || name)' due to data type mismatch: differing types in '((cnt > 1) || name)' (boolean and string).; 

What is the right way to do this? It seems to me that parsing stops after the name column. Is this a bug in the parser? I am using Spark 1.5.1.

scala apache-spark apache-spark-sql spark-dataframe
2 answers

You have to parenthesize the individual expressions. Because isin is written here as an infix (alphanumeric) operator, it has lower precedence than ||, so without parentheses the condition is parsed as ((cnt > 1) || name) isin ("A", "B"), which is exactly the type mismatch shown in the error:

 xdf.filter(($"cnt" > 1) || ($"name" isin ("A","B"))).show() 

Hope this helps:

    val list = List("x", "y", "t")
    xdf.filter($"column".isin(list: _*))
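Applied to the DataFrame from the question, the list-based isin can be combined with another predicate in the same way; this is a sketch where the contents of the names list are illustrative:

    val names = List("A", "B")
    xdf.filter(($"cnt" > 1) || $"name".isin(names: _*)).show()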
