How to use spark-jobserver from a Java project?

I am using Spark in a Java project. I tried to use spark-jobserver as our web service, but ran into great difficulties. Is there an example Java project that uses spark-jobserver?

My question: it seems we only need to implement SparkJob in our Java class and override the runJob method. But I find that in runJob the context parameter is a SparkContext, not the JavaSparkContext that we use in our Java project.

public Object runJob(SparkContext sparkcontext, Config config) {
    // TODO Auto-generated method stub
    return null;
}
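
For completeness, a minimal sketch of the job class (the class name is made up, and the validate stub is an assumption based on the spark-jobserver SparkJob interface, which also declares a validate method returning a SparkJobValidation):

import com.typesafe.config.Config;
import org.apache.spark.SparkContext;
import spark.jobserver.SparkJob;
import spark.jobserver.SparkJobValid$;
import spark.jobserver.SparkJobValidation;

// Hypothetical job class: spark-jobserver hands runJob a Scala SparkContext,
// not the JavaSparkContext that plain Java Spark code is usually written against.
public class MyJavaJob implements SparkJob {

    @Override
    public Object runJob(SparkContext sparkContext, Config config) {
        // job logic goes here; the returned value becomes the job result
        return null;
    }

    @Override
    public SparkJobValidation validate(SparkContext sparkContext, Config config) {
        // accept every job configuration in this sketch
        return SparkJobValid$.MODULE$;
    }
}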

So I changed JavaSparkContext to SparkContext in our job class, but then I ran into another problem. Please help me fix it. Thanks.

Before the change:

JavaPairRDD<String, Row> pairData;

.................
JavaSchemaRDD schemaRDD = sqlContext.applySchema(pairData.values(), schema);

After the change:

// Because I now have to use SparkContext, I changed JavaPairRDD<String, Row> to RDD<Tuple2<String, Row>>.

RDD<Tuple2<String, Row>> pairData;
...............................


// I can't use pairData.values() any more, so I transform pairData into rowRDD.


RDD<Row> rowRDD = pairData.map(new Function<Tuple2<String, Row>, Row>() {
    public Row call(Tuple2<String, Row> re) throws Exception {
        return re._2();
    }
});

But this .map does not work either, because the classes I use come from the Java API (org.apache.spark.sql.api.java) rather than the Scala one. How can I convert between JavaSparkContext and SparkContext so that I can keep writing the job in Java?


You can convert between a JavaSparkContext and a SparkContext: JavaSparkContext.sc() returns the underlying SparkContext, and JavaSparkContext.fromSparkContext() wraps an existing SparkContext in a JavaSparkContext.
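
For example, a minimal sketch (class and variable names are illustrative):

import com.typesafe.config.Config;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical class showing the conversion in both directions.
public class ContextConversionExample {

    public Object runJob(SparkContext sparkcontext, Config config) {
        // Wrap the Scala SparkContext that spark-jobserver passes in, so the
        // existing JavaSparkContext-based code (JavaPairRDD, JavaSchemaRDD, ...)
        // can stay as it was.
        JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sparkcontext);

        // When an API needs the Scala SparkContext again, unwrap it.
        SparkContext sc = jsc.sc();

        return sc.appName();
    }
}

With the wrapped JavaSparkContext, the original JavaPairRDD<String, Row> pipeline and the pairData.values() call should work as before.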

