How to submit a Spark job from a Java program to a standalone Spark cluster without using spark-submit?

I use Spark to do some calculations, but I want the job to be submitted from a Java application. It works correctly when submitted with the spark-submit script. Has anyone tried to do this?

Thanks.

+7
java apache-spark
2 answers

Remember to add the fat JAR containing your code to the context.

 val conf = new SparkConf()
   .setMaster(...)
   .setAppName(...)
   .setJars(Seq("/path/to/code.jar"))
 val sc = new SparkContext(conf)
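Since the question is about a Java application, the same idea looks roughly like the sketch below; the master URL, application name, and jar path are placeholders you would replace with your own values:

 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaSparkContext;

 public class SubmitFromJava {
     public static void main(String[] args) {
         // Placeholder values: use your own master URL, app name, and jar path
         SparkConf conf = new SparkConf()
                 .setMaster("spark://your-master-host:7077")
                 .setAppName("YourAppName")
                 .setJars(new String[] {"/path/to/code.jar"});
         JavaSparkContext sc = new JavaSparkContext(conf);
         // ... run your job here ...
         sc.stop();
     }
 }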
+5

As long as you have a master and an available worker running, you should be able to, if your Java application has the following:

 String master = "spark://IP:7077"; // set IP address to that of your master
 String appName = "Name of your Application Here";
 SparkConf conf = new SparkConf().setAppName(appName).setMaster(master);
 JavaSparkContext sc = new JavaSparkContext(conf);

I managed to run JUnit tests from IntelliJ that used a JavaSparkContext, without having to use the spark-submit script. I am encountering problems when performing actions on DataFrames (not sure if this is related). A minimal sketch of such a test is shown below.
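The following sketch assumes Spark and JUnit are on the test classpath and runs against a local in-process master; the class and method names are only examples, and you could swap in the standalone master URL instead:

 import java.util.Arrays;
 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaSparkContext;
 import org.junit.Test;
 import static org.junit.Assert.assertEquals;

 public class SparkContextTest {
     @Test
     public void countsElementsWithoutSparkSubmit() {
         // "local[2]" runs Spark in-process; use "spark://IP:7077" to hit a standalone master
         SparkConf conf = new SparkConf()
                 .setMaster("local[2]")
                 .setAppName("junit-spark-test");
         JavaSparkContext sc = new JavaSparkContext(conf);
         try {
             long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
             assertEquals(4L, count);
         } finally {
             sc.stop();
         }
     }
 }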

+2
