Is there a way to stop a Spark worker through a terminal? I know the scripts: start-all.sh, stop-all.sh, stop-workers.sh, etc. However, every time I run start-all.sh, the residual workers from the previous instance of the Spark cluster seem to come back as well. I know this because each worker's ID contains the date and timestamp when the worker was created.
So, when I run start-all.sh today, I see the same 7 or so workers that were created back in early April.
Is there a way to kill these old workers? Or perhaps a way to grep for their process names?
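Something along these lines is what I had in mind; a rough sketch, assuming the stray workers show up as org.apache.spark.deploy.worker.Worker JVM processes on the host:

```bash
# List candidate worker PIDs (jps ships with the JDK):
jps -l | grep org.apache.spark.deploy.worker.Worker

# Or with ps/grep if jps is unavailable
# (the [o] trick keeps grep from matching itself):
ps aux | grep '[o]rg.apache.spark.deploy.worker.Worker'

# Kill them once the PIDs look right:
ps aux | grep '[o]rg.apache.spark.deploy.worker.Worker' | awk '{print $2}' | xargs -r kill
```

But I'm not sure whether this is safe, or whether there's a cleaner built-in way.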