I am using Spark in a Java project. I tried to use spark-jobserver as our web service, but I ran into great difficulties. Is there any Java example project for using spark-jobserver?
My question: it seems I only need to implement SparkJob in our Java class and override the runJob method. But I found that in runJob the context parameter is typed as SparkContext, not the JavaSparkContext that we use in our Java project.
public Object runJob(SparkContext sparkcontext, Config config) {
    return null;
}
So I changed JavaSparkContext to SparkContext in our job class, but then I ran into another problem. Please help me fix it. Thanks.
Before the change:
JavaPairRDD<String, Row> pairData;
.................
JavaSchemaRDD schemaRDD = sqlContext.applySchema(pairData.values(), schema);
After the change:
RDD<Tuple2<String, Row>> pairData;
...............................
RDD<Row> rowRDD = pairData.map(new Function<Tuple2<String, Row>, Row>() {
    public Row call(Tuple2<String, Row> re) throws Exception {
        return re._2();
    }
});
Now the .map call does not compile, the classes from org.apache.spark.sql.api.java no longer work with these RDDs, and I am still stuck between JavaSparkContext and SparkContext.
Is there a Java example for this?
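What I am wondering (I am not sure this is the right approach, so please correct me) is whether I can keep the Java API inside the job and simply wrap the SparkContext that job-server passes to runJob back into a JavaSparkContext. A rough sketch of that idea; the JavaSQLContext wiring is my assumption, and the pairData/schema construction is the same elided code as in the "before change" snippet:

import com.typesafe.config.Config;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.api.java.JavaSQLContext;

public Object runJob(SparkContext sparkcontext, Config config) {
    // Wrap the Scala SparkContext handed over by job-server so the rest of
    // the job can keep using the Java API (JavaPairRDD, JavaSchemaRDD, ...)
    // exactly as it did before the change.
    JavaSparkContext jsc = new JavaSparkContext(sparkcontext);
    JavaSQLContext sqlContext = new JavaSQLContext(jsc);

    // JavaPairRDD<String, Row> pairData = ...;  // built as in "before change"
    // JavaSchemaRDD schemaRDD = sqlContext.applySchema(pairData.values(), schema);
    return null;
}

Would this compile and work, and is it the recommended way to write spark-jobserver jobs in Java?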