How to set up spark.io.compression.codec = lzf in Spark


I usually use the spark-submit script to run our driver class, as shown below:

./spark-submit --master spark://testserver:7077 --class com.spark.test.SparkTest --conf "spark.io.compression.codec=lzf" /tmp/test/target/test.jar

That way I can set spark.io.compression.codec=lzf on the command line. But I do not want to use spark-submit to run our driver class; I want to run the job through spark-jobserver instead. How do I configure this in spark-jobserver? I tried setting it in environment variables, but that did not work. I also tried the code below; it still does not work.

SparkConf sparkConf = new SparkConf().setMaster("spark://testserver:7077")
    .setAppName("Javasparksqltest")
    .set("spark.executor.memory", "8g")
    .set("spark.io.compression.codec", "lzf");

spark-submit and spark-shell both read conf/spark-defaults.conf, so you can put the setting there.
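
For example, a line like the following in conf/spark-defaults.conf (property name and value separated by whitespace) applies the codec to every application launched from that machine:

# conf/spark-defaults.conf
# Picked up by applications launched via spark-submit or spark-shell
spark.io.compression.codec    lzf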

spark-jobserver creates and configures its Spark contexts itself, so the property has to go through the job server's own configuration (the settings around spark.context). The "Context configuration" section of its Readme.md describes how:

https://github.com/spark-jobserver/spark-jobserver/blob/master/README.md
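
For illustration, here is a minimal sketch of how such a setting might look in the job server's HOCON configuration file. The key names (for example context-settings) are assumptions that can differ across versions, so verify them against the README above:

# Hypothetical spark-jobserver config sketch (HOCON); key names here are
# assumptions, check the "Context configuration" section of the README.
spark {
  master = "spark://testserver:7077"

  context-settings {
    # Spark properties placed here are meant to be applied to the
    # contexts the job server creates, including the codec in question.
    spark.io.compression.codec = lzf
  }
}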


"org.apache.spark.io.LZFCompressionCodec" "lzf"
