How do I set spark.io.compression.codec=lzf in Spark?
I usually use the spark-submit script to run our driver class, as shown below:
./spark-submit --master spark://testserver:7077 --class com.spark.test.SparkTest --conf "spark.io.compression.codec=lzf" /tmp/test/target/test.jar
So I can set spark.io.compression.codec=lzf on the command line. But I do not want to use spark-submit to run our driver class; I want to run the job through spark-jobserver instead. How do I configure this property in spark-jobserver? I tried setting it in environment variables, but that did not work. I also tried the code below, and it still does not work.
SparkConf sparkConf = new SparkConf()
    .setMaster("spark://testserver:7077")
    .setAppName("Javasparksqltest")
    .set("spark.executor.memory", "8g")
    .set("spark.io.compression.codec", "lzf");
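For context: spark-jobserver normally creates the SparkContext itself, so settings made on a SparkConf inside the job are applied too late. A sketch of how this is usually done instead, via jobserver's HOCON configuration file (the exact file name, e.g. local.conf, and the context-settings/passthrough layout are assumptions based on the typical spark-jobserver config template — check the template shipped with your jobserver version):

```
# Hypothetical spark-jobserver config override (e.g. local.conf).
spark {
  context-settings {
    num-cpu-cores = 4
    memory-per-node = 8g
    # Spark properties in this block are assumed to be passed
    # straight through to the SparkConf of new contexts.
    passthrough {
      spark.io.compression.codec = lzf
    }
  }
}
```

If your jobserver version supports per-context overrides on the REST API, the same property may also be passable as a query parameter when creating a context — but that is version-dependent, so verify against your jobserver's documentation.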