Manually Stop Spark Workers

Is there a way to stop a Spark worker from the terminal? I know about the scripts start-all.sh, stop-all.sh, stop-workers.sh, etc. However, every time I run start-all.sh, the residual workers from the previous instance of the Spark cluster seem to come back as well. I know this because each worker's ID contains the date and timestamp of when the worker was created.

So, when I run start-all.sh today, I still see the same 7 or so workers that were created in early April.

Is there a way to kill these old workers? Or perhaps a way to grep for their process names?
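
For reference, one way to check which worker JVMs are actually running is to list Java processes by their main class (a sketch, assuming a JDK with jps on the PATH; standalone workers run under the org.apache.spark.deploy.worker.Worker class):

jps -l | grep org.apache.spark.deploy.worker.Worker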

1 answer

This has happened to me in the past; here is what I usually do:

1) Find the process id:

ps aux | grep spark 

2) And kill it, using the PID found in step 1:

sudo kill <pid>
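
If a plain grep on "spark" matches too much (the master, a running shell, and so on), a more targeted variant is to match on the worker's main class. This is just a sketch, assuming the workers are standalone-mode workers running the org.apache.spark.deploy.worker.Worker class and that pkill is available:

# Preview the matching processes first
ps aux | grep org.apache.spark.deploy.worker.Worker | grep -v grep

# Kill only the standalone worker JVMs (prefix with sudo if they run as another user)
pkill -f org.apache.spark.deploy.worker.Worker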