How to set the spark.local.dir property from the Spark shell?

I am trying to set spark.local.dir from the Spark shell using sc.getConf.set("spark.local.dir", "/temp/spark"), but it does not work. Is there any other way to set this property from the Spark shell?

scala apache-spark
2 answers

You cannot do this from within the shell: since the Spark context has already been created, the local directory has already been set (and is in use). You should pass it as a parameter when starting the shell:

 ./spark-shell --conf spark.local.dir=/temp/spark 
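Once the shell is up, you can confirm the setting took effect. This is a quick check, assuming a standard spark-shell session where sc is predefined:

  // run inside the spark-shell started with --conf spark.local.dir=/temp/spark
  // getConf returns a copy of the configuration the context was created with
  sc.getConf.get("spark.local.dir")   // res0: String = /temp/spark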

@Tzach Zohar's solution seems to be the right answer.

However, if you insist on setting spark.local.dir from the Spark shell, you can do this:

1) close the current spark context

  sc.stop() 

2) update the sc configuration and restart it.

The updated code was kindly provided by @Tzach-Zohar:

  SparkSession.builder.config(sc.getConf).config("spark.local.dir", "/temp/spark").getOrCreate()

As @Tzach Zohar notes: "but you get WARN SparkContext: Use an existing SparkContext, some configuration may not take effect", which suggests that this is not the recommended way.
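For reference, a minimal end-to-end sketch of the restart sequence described above, assuming Spark 2.x (where SparkSession is available) and using the example directory /temp/spark from the question:

  import org.apache.spark.sql.SparkSession

  // 1) stop the context that spark-shell created
  sc.stop()

  // 2) rebuild the session, reusing the old configuration but overriding spark.local.dir
  val spark = SparkSession.builder
    .config(sc.getConf)                        // getConf still returns the old SparkConf
    .config("spark.local.dir", "/temp/spark")
    .getOrCreate()

  // grab the new context and verify the setting
  val newSc = spark.sparkContext
  newSc.getConf.get("spark.local.dir")         // should return /temp/spark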

