As long as you have a master running and an available worker, you should be able to run it, provided your Java application contains something like the following:
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    String master = "spark://IP:7077"; // set IP to the address of your master
    String appName = "Name of your Application Here";
    SparkConf conf = new SparkConf().setAppName(appName).setMaster(master);
    JavaSparkContext sc = new JavaSparkContext(conf);
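Once the context is created, a quick sanity check is to run a trivial job so you can see whether the master accepted the application and a worker executed a task. The snippet below is only a sketch; the element values and the printed message are illustrative:

    // Run a trivial job to confirm the cluster is reachable and a worker
    // actually executed the task.
    long n = sc.parallelize(java.util.Arrays.asList(1, 2, 3)).count();
    System.out.println("count = " + n); // prints 3 if the job ran successfully

    sc.stop(); // release the context when you are done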
I managed to run JUnit tests from IntelliJ that used JavaSparkContext without having to use a spark-submit script. I did, however, run into problems when performing actions on DataFrames (not sure whether that is related).
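For tests, you do not even need the standalone master; the context can be pointed at a local master so everything runs in-process. Below is a minimal sketch of such a JUnit 4 test (the class and test names are made up, and it assumes spark-core and JUnit are on the test classpath):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.junit.Test;

    import java.util.Arrays;

    import static org.junit.Assert.assertEquals;

    public class SparkLocalTest {

        @Test
        public void countsElementsWithLocalMaster() {
            SparkConf conf = new SparkConf()
                    .setAppName("junit-spark-test")
                    .setMaster("local[*]"); // run Spark in-process, using all cores
            JavaSparkContext sc = new JavaSparkContext(conf);
            try {
                long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
                assertEquals(4L, count);
            } finally {
                sc.stop(); // release the context so other tests can create their own
            }
        }
    }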