TL;DR
The Spark UI shows a different number of cores and a different amount of memory than what I request via spark-submit.
More details:
I am running Spark 1.6 in standalone mode. When I run spark-submit, I request 1 executor instance with 1 core for the executor, plus 1 core for the driver, so I would expect my application to use 2 cores in total. The Environment tab in the UI confirms that the parameters I passed were picked up correctly, yet Spark still appears to be using a different number of cores. You can see it here:

This is the spark-defaults.conf I use:
spark.executor.memory 5g
spark.executor.cores 1
spark.executor.instances 1
spark.driver.cores 1
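For reference, the same settings can also be passed explicitly on the command line with --conf, which takes precedence over spark-defaults.conf. This is only a sketch: the master URL and application file are placeholders, not taken from the original post.

```shell
# Hypothetical invocation; master URL and app name are assumptions.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.executor.memory=5g \
  --conf spark.executor.cores=1 \
  --conf spark.executor.instances=1 \
  --conf spark.driver.cores=1 \
  my_app.py
```

Note that in standalone mode spark.cores.max (or --total-executor-cores) also influences how many cores an application grabs, so it may be worth checking whether that setting differs from what the per-executor values suggest.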
Checking the Environment tab in the Spark UI shows that these parameters really were accepted, but the UI still reports something else.
Does anyone have an idea what could make Spark use a different number of cores than the ones I requested? I have obviously tried searching for this, but did not find anything useful on the subject.
Thank you in advance.