We have tried various combinations of settings, but mpstat shows that all or most processors are always in use (on the same 8-core system).
Attempts:
Setting the master to
local[2]
Setting
conf.set("spark.cores.max", "2")
in the Spark configuration.
Also using
--total-executor-cores 2
and
--executor-cores 2
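The programmatic attempts above can be sketched together as follows (a minimal illustration assuming PySpark; the master URL and property name are the ones from the question, combined here in one place):

```python
from pyspark import SparkConf, SparkContext

# Combine the attempted settings from the question:
conf = (
    SparkConf()
    .setAppName("limit-cores-test")
    .setMaster("local[2]")         # local mode with at most 2 worker threads
    .set("spark.cores.max", "2")   # cap on total cores for the application
)
sc = SparkContext(conf=conf)
```

The equivalent submit-time form of the flags above would be something like `spark-submit --total-executor-cores 2 --executor-cores 2 ...`, which was also tried.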
In all cases
mpstat -A
indicates that all of the processors are in use, not just the two requested.
So I am now at a loss. We need to limit Spark's usage to a specified number of processors.