Get a list of data types from a schema in Apache Spark

I have the following code in PySpark that gets a list of column names from a DataFrame schema, and it works fine. How can I get a list of the column data types as well?

columnNames = df.schema.names 

For example, something like:

 columnTypes = df.schema.types 

Is there a way to get a separate list of data types contained in a DataFrame schema?

1 answer

Here is a suggestion:

 df = sqlContext.createDataFrame([('a', 1)])
 types = [f.dataType for f in df.schema.fields]
 types
 > [StringType, LongType]


