I work in the Spark shell (Spark 2.1.0, Scala 2.11.8, 64-bit server OpenJDK 1.7.0_151).
I import the Column class:
scala> import org.apache.spark.sql.Column
import org.apache.spark.sql.Column
I can define a value of type Column:
scala> val myCol: Column = col("blah")
myCol: org.apache.spark.sql.Column = blah
and use Column in a function definition:
scala> def myFunc(c: Column) = ()
myFunc: (c: org.apache.spark.sql.Column)Unit
So far so good. But in a case class definition, Column is not found:
scala> case class myClass(c: Column)
<console>:11: error: not found: type Column
case class myClass(c: Column)
However, a one-liner works:
scala> case class myClass(c: org.apache.spark.sql.Column)
defined class myClass
or
scala> import org.apache.spark.sql.Column; case class myClass(c: Column)
import org.apache.spark.sql.Column
defined class myClass
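For contrast, the same two-step pattern (import on one line, case class on a later line) compiles fine outside the shell, which suggests the problem lies in how the REPL wraps each submitted line rather than in the import itself. A minimal sketch, runnable as a plain Scala script; the Column here is a hypothetical stand-in, since Spark is not assumed to be on the classpath:

```scala
// Stand-in for org.apache.spark.sql.Column (hypothetical; Spark not on classpath).
object fakeSql {
  class Column(val expr: String) {
    override def toString: String = expr
  }
}

// The import sits on its own line, as in the shell session above...
import fakeSql.Column

// ...and the case class on a later line still sees it.
case class MyClass(c: Column)

println(MyClass(new Column("blah")).c)
```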