@Tzach Zohar's solution seems to be the right answer.
However, if you insist on setting spark.local.dir from the Spark shell, you can do this:
1) Stop the current Spark context:
sc.stop()
2) Update the configuration and restart the context.
Updated code was kindly provided by @Tzach-Zohar:
SparkSession.builder.config(sc.getConf).config("spark.local.dir", "/temp/spark").getOrCreate()
@Tzach Zohar notes: "but you get WARN SparkContext: Use an existing SparkContext, some configuration may not take effect", which suggests that this is not the recommended way.
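Putting the two steps together, a full Spark-shell session might look like the sketch below. The `/temp/spark` path is just an example; use any writable directory. Note that `spark.local.dir` must be set before the new context starts, which is why the old one has to be stopped first:

```scala
import org.apache.spark.sql.SparkSession

// 1) Stop the SparkContext that the shell created at startup.
sc.stop()

// 2) Rebuild the session, reusing the old configuration but
//    overriding spark.local.dir (example path, adjust as needed).
val spark = SparkSession.builder
  .config(sc.getConf)
  .config("spark.local.dir", "/temp/spark")
  .getOrCreate()

// The shell's original `sc` reference is now stale; take the new one.
val newSc = spark.sparkContext
```

As the warning quoted above indicates, `getOrCreate()` may silently reuse an existing context, so some settings may not take effect; setting `spark.local.dir` in `spark-defaults.conf` or via `--conf` at shell launch remains the safer route.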
Yaron