Spark >= 2.2
You can skip unix_timestamp and use to_date or to_timestamp:
from pyspark.sql.functions import to_date, to_timestamp

df_test.withColumn("date", to_date("date", "dd-MMM-yy")).show()
## +---+----------+
## | id|      date|
## +---+----------+
## |  1|2015-07-14|
## |  2|2015-06-14|
## |  3|2015-10-11|
## +---+----------+

df_test.withColumn("date", to_timestamp("date", "dd-MMM-yy")).show()
## +---+-------------------+
## | id|               date|
## +---+-------------------+
## |  1|2015-07-14 00:00:00|
## |  2|2015-06-14 00:00:00|
## |  3|2015-10-11 00:00:00|
## +---+-------------------+
and then apply the other datetime functions shown below.
Spark < 2.2
It is not possible to return multiple top-level columns from a single UDF call. You can, however, return a structure or a collection type from the UDF, as follows:
from pyspark.sql.types import StringType, StructType, StructField
from pyspark.sql import Row
from pyspark.sql.functions import udf, col

schema = StructType([
    StructField("day", StringType(), True),
    StructField("month", StringType(), True),
    StructField("year", StringType(), True)
])

def split_date_(s):
    try:
        d, m, y = s.split("-")
        return d, m, y
    except:
        return None

split_date = udf(split_date_, schema)

transformed = df_test.withColumn("date", split_date(col("date")))
transformed.printSchema()
This works, but it is not only quite verbose in PySpark, it is also expensive, since every row has to pass through a Python UDF.
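If you do take the UDF route, the struct can still be expanded into top-level columns afterwards with an ordinary select (a minimal sketch, reusing transformed and the field names from the schema above):

transformed.select(
    col("id"),
    col("date.day").alias("day"),
    col("date.month").alias("month"),
    col("date.year").alias("year")
).show()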
For date-based conversions, you can simply use the built-in functions:
from pyspark.sql.functions import unix_timestamp, dayofmonth, year, date_format

transformed = (df_test
    .withColumn("ts", unix_timestamp(col("date"), "dd-MMM-yy").cast("timestamp"))
    .withColumn("day", dayofmonth(col("ts")).cast("string"))
    .withColumn("month", date_format(col("ts"), "MMM"))
    .withColumn("year", year(col("ts")).cast("string"))
    .drop("ts"))
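With the sample rows parsed earlier (the raw strings are assumed to be 14-Jul-15, 14-Jun-15 and 11-Oct-15, consistent with the dd-MMM-yy format and the dates shown above), the result should look roughly like:

transformed.show()
## +---+---------+---+-----+----+
## | id|     date|day|month|year|
## +---+---------+---+-----+----+
## |  1|14-Jul-15| 14|  Jul|2015|
## |  2|14-Jun-15| 14|  Jun|2015|
## |  3|11-Oct-15| 11|  Oct|2015|
## +---+---------+---+-----+----+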
Similarly, you can use regexp_extract to split the date.
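For example (a minimal sketch; the pattern assumes the dd-MMM-yy layout used above, and like the UDF approach it keeps the two-digit year as-is):

from pyspark.sql.functions import regexp_extract, col

pattern = r"(\d{1,2})-([A-Za-z]{3})-(\d{2})"  # day-month-year, e.g. 14-Jul-15

transformed = (df_test
    .withColumn("day", regexp_extract(col("date"), pattern, 1))
    .withColumn("month", regexp_extract(col("date"), pattern, 2))
    .withColumn("year", regexp_extract(col("date"), pattern, 3)))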
See also Output multiple columns from one column in a Spark DataFrame
Note: If you use a Spark version that is not patched with SPARK-11724, this will require a correction after unix_timestamp(...) and before cast("timestamp").