I have a very long-running Spark job in which a small number of tasks are currently stalled. Is there any way to kill just these running tasks from the driver node?
For permission reasons I can log in to the driver node, but I cannot kill processes on the worker nodes, so I'm looking for a way to do this from the driver only. Note that I don't want to kill the entire Spark job - only the one or two stalled tasks.
If it helps, I'm using Mesos and have access to the web UI, but it does not offer an option to kill a task.
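To make it concrete, something like the sketch below is what I'm hoping exists - a minimal Scala sketch, assuming a live SparkContext named sc and that I can read the stalled task IDs off the stage page of the web UI. I don't know whether a per-task kill API such as SparkContext.killTaskAttempt is actually available in my Spark version, so treat this as illustration of the intent, not something I've been able to run:

```scala
import org.apache.spark.SparkContext

// Sketch: kill only the stalled tasks by ID, leaving the rest of the job running.
// Task IDs are assumed to come from the stage page of the Spark web UI.
def killStalledTasks(sc: SparkContext, stalledTaskIds: Seq[Long]): Unit = {
  stalledTaskIds.foreach { taskId =>
    // killTaskAttempt (Spark 2.2+) returns true if the scheduler knew the task
    // and a kill request was sent to the executor running it.
    val sent = sc.killTaskAttempt(
      taskId,
      interruptThread = true,
      reason = "manually killed: task appeared stalled")
    println(s"kill request for task $taskId sent: $sent")
  }
}
```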
distributed-computing apache-spark mesos
nbubis