Parsing date/time from a CSV in Zeppelin and Spark

I am trying to read a CSV file and create a data frame.

The CSV file looks like the sample below. I used the ISO 8601 date/time format to represent the date/time strings.

    2015-6-29T12:0:0,b82debd63cffb1490f8c9c647ca97845,G1J8RX22EGKP,2015-6-29T12:0:5,2015-6-29T12:0:6,0QA97RAM1GIV,2015-6-29T12:0:10,2015-6-29T12:0:11,2015-6-29T12:0:12,2015-6-29T12:5:42,1
    2015-6-29T12:20:0,0d60c871bd9180275f1e4104d4b7ded0,5HNB7QZSUI2C,2015-6-29T12:20:5,2015-6-29T12:20:6,KSL2LB0R6367,2015-6-29T12:20:10,2015-6-29T12:20:11,2015-6-29T12:20:12,2015-6-29T12:25:13,1
    ......

To load this data, I wrote the following Scala code in Zeppelin:

    import org.apache.spark.sql.types.DateType
    import org.apache.spark.sql.functions._
    import org.joda.time.DateTime
    import org.joda.time.format.DateTimeFormat
    import sys.process._

    val logCSV = sc.textFile("log_table.csv")

    case class record(
      callingTime: DateTime,
      userID: String,
      CID: String,
      serverConnectionTime: DateTime,
      serverResponseTime: DateTime,
      connectedAgentID: String,
      beginCallingTime: DateTime,
      endCallingTime: DateTime,
      Succeed: Int)

    val formatter = DateTimeFormat.forPattern("yyyy-mm-dd'T'kk:mm:ss")

    val logTable = logCSV.map(s => s.split(","))
      .map(s => record(
        formatter.parseDateTime(s(0)), s(1), s(2),
        formatter.parseDateTime(s(3)),
        formatter.parseDateTime(s(4)), s(5),
        formatter.parseDateTime(s(6)),
        formatter.parseDateTime(s(7)),
        s(8).toInt))
      .toDF()

This produced the error shown below. The core problem is that DateTime is not serializable.

    logCSV: org.apache.spark.rdd.RDD[String] = log_table.csv MapPartitionsRDD[38] at textFile at <console>:169
    defined class record
    formatter: org.joda.time.format.DateTimeFormatter = org.joda.time.format.DateTimeFormatter@46051d99
    org.apache.spark.SparkException: Task not serializable
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
        at org.apache.spark.SparkContext.clean(SparkContext.scala:1623)
        at org.apache.spark.rdd.RDD.map(RDD.scala:286)

So I am wondering how to process date/time information in Scala. Could you help me?

+5
3 answers

Since DateTime is not serializable, you can call parseMillis on the DateTimeFormatter instead: it returns a long, which auto-boxes to Long, and Long is Serializable. To get a DateTime back from that Long, use the DateTime(longInstance.longValue()) constructor.
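A minimal sketch of this round-trip, shown with the JDK's java.time in case joda-time is not on your classpath (the joda equivalents would be formatter.parseMillis(s) and new DateTime(millis)):

```scala
import java.time.{LocalDateTime, ZoneOffset}
import java.time.format.DateTimeFormatter

// Parse the string down to epoch millis: a plain Long, which is
// Serializable, so it can safely travel through a Spark closure.
val fmt = DateTimeFormatter.ofPattern("yyyy-M-d'T'H:m:s")
val millis: Long = LocalDateTime.parse("2015-6-29T12:0:5", fmt)
  .toInstant(ZoneOffset.UTC).toEpochMilli

// Rebuild a date/time object from the Long on the other side.
val restored = LocalDateTime.ofEpochSecond(millis / 1000, 0, ZoneOffset.UTC)
```

The key point is that only the Long crosses the serialization boundary; the formatter and the date object stay local.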

+2

You have run into the problem that your formatter is not serializable. Instead, you can construct the formatter inside the map (or use mapPartitions and construct it there, so you only build one formatter per partition).
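A sketch of the mapPartitions idea, with the partition body simulated on a plain Iterator (and java.time standing in for joda) so it runs without a cluster:

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

// In Spark this would look like:
//   val logTable = logCSV.mapPartitions { rows =>
//     val formatter = DateTimeFormat.forPattern("yyyy-MM-dd'T'HH:mm:ss")
//     rows.map(row => /* parse fields with formatter */ row)
//   }
// The formatter is created inside the task, so it is never serialized.
def parsePartition(rows: Iterator[String]): Iterator[LocalDateTime] = {
  val fmt = DateTimeFormatter.ofPattern("yyyy-M-d'T'H:m:s") // one per "partition"
  rows.map(LocalDateTime.parse(_, fmt))
}

val parsed = parsePartition(
  Iterator("2015-6-29T12:0:5", "2015-6-29T12:20:5")).toList
```

Building the formatter once per partition avoids both the serialization error and the cost of constructing it once per row.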

+1

Thanks everyone for the answers!! I decided to use java.sql.Timestamp, because it is serializable and works in a DataFrame. I revised the code as shown below.

    import java.sql.Timestamp

    case class Record(
      callingTime: Timestamp,
      userID: String,
      CID: String,
      succeed: Int)

    val dataFrame = logCSV.map(_.split(","))
      .map(r => Record(
        Timestamp.valueOf(r(0).replace("T", " ")),
        r(1), r(2), r(10).toInt))
      .toDF()
    dataFrame.registerTempTable("dataFrame")

The date/time format in my data is ISO 8601, but Timestamp.valueOf expects a space instead of the 'T' separator. So I replace 'T' with ' ' first, and then Timestamp.valueOf can parse the string.
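A minimal, runnable sketch of just that conversion, on a sample string in the same shape as the data above:

```scala
import java.sql.Timestamp

// Timestamp.valueOf expects "yyyy-[m]m-[d]d hh:mm:ss[.f...]" with a space
// separator, so swap out the ISO 8601 'T' before parsing.
val raw = "2015-6-29T12:00:05"
val ts  = Timestamp.valueOf(raw.replace("T", " "))
```

Because java.sql.Timestamp implements Serializable, the resulting value can be carried through RDD transformations and used directly as a DataFrame column.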

+1
