Of the options below, Option 3 is probably the one you are looking for.
Option 1: The Spark web UI shows the total cores available to the application and the cores currently in use.

Option 2: Default values: sc.defaultParallelism is usually set to the total number of worker cores in your cluster.
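For example, you can read this value straight from the SparkContext. A minimal sketch; the app name and `local[*]` master are placeholders for your own setup:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DefaultParallelismExample {
  def main(args: Array[String]): Unit = {
    // Placeholder config: in local[*] mode, defaultParallelism is the number of local cores.
    val conf = new SparkConf().setAppName("default-parallelism").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // On most cluster managers this usually reflects the total executor cores available.
    println(s"defaultParallelism = ${sc.defaultParallelism}")
    sc.stop()
  }
}
```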
Option 3: You can use ExecutorInfo.totalCores from a SparkListener, as shown below.
The docs say:

> public class ExecutorInfo extends Object
>
> Stores information about an executor to pass from the scheduler to SparkListeners.
```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}

// Logs each executor's host and total core count as it registers with the driver.
final class ExecutorLogger extends SparkListener {
  override def onExecutorAdded(executorAdded: SparkListenerExecutorAdded): Unit =
    println(s"Executor ${executorAdded.executorId} added: " +
      s"${executorAdded.executorInfo.executorHost} ${executorAdded.executorInfo.totalCores} cores")
}
```
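To see the output, register the listener on the SparkContext before executors are added. A minimal sketch, assuming the ExecutorLogger above is on the classpath; the app name and master URL are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ExecutorLoggerApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("executor-logger").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // Register the listener so SparkListenerExecutorAdded events reach ExecutorLogger.
    sc.addSparkListener(new ExecutorLogger())

    // ... run your job here; each executor is logged with its host and core count.
    sc.stop()
  }
}
```

Alternatively, you can set the spark.extraListeners configuration property to the listener's fully qualified class name, and Spark will register it for you at startup.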