How to exit spark-submit after submitting

When I submit a Spark program with the spark-submit script (YARN mode), it keeps polling the application status and never exits.

Is there an option in spark-submit to make it exit after submission?

=== Why this is a problem for me ===

The streaming program will run forever, so I do not need the status updates.

I can press Ctrl+C to stop it when I start it manually, but I have many streaming contexts to start and I need to start them from a script.

I can put the spark-submit process in the background, but after many of these large Java processes have been created, the same user cannot start any further Java process, because the JVM cannot create a GC thread.
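For concreteness, the kind of launch script in question might look something like this (the job directory and jar paths are hypothetical); each backgrounded spark-submit leaves a long-lived JVM on the launch host, which is what eventually exhausts the user's process limit:

    # Hypothetical launcher: one long-lived spark-submit JVM stays behind per job.
    for jar in jobs/*.jar; do
        spark-submit --master yarn "$jar" &
    done
    wait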

Tags: yarn, apache-spark
3 answers

Interesting. I had never thought about this problem. I am not sure there is a clean way to do it, but you can simply kill the spark-submit process on the submitting machine, and the YARN application keeps running until you stop it. So you can write a script that runs spark-submit and then kills that process. When you really want to stop the application, use yarn application -kill <applicationId>. Dirty, but it works.
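A minimal sketch of that approach, assuming cluster deploy mode (so the driver does not live inside the killed client) and a hypothetical jar myapp.jar; the 30-second sleep is only a rough guess at how long the submission needs:

    # Submit in the background, give YARN time to accept the application,
    # then kill only the local spark-submit client process.
    spark-submit --master yarn --deploy-mode cluster myapp.jar &
    SUBMIT_PID=$!
    sleep 30              # rough allowance for the upload/acceptance phase
    kill "$SUBMIT_PID"    # the YARN application keeps running

    # Later, to stop the application itself:
    # yarn application -kill <applicationId>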


I know this is an old question, but there is now a way to do this: set --conf spark.yarn.submit.waitAppCompletion=false when you run spark-submit. With that setting, the client exits after successfully submitting the application. From the documentation:

In YARN cluster mode, this controls whether the client waits to exit until the application completes. If set to true, the client process stays alive, reporting the application's status. Otherwise, the client process exits after submission.

You may also need to set --deploy-mode to cluster:

In cluster mode, the Spark driver runs inside an application master process which is managed by YARN on the cluster, and the client can go away after initiating the application.
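Putting the two together, a submission that returns as soon as YARN accepts the application might look like this (the main class and jar are hypothetical placeholders):

    # com.example.StreamingApp and myapp.jar are placeholders.
    spark-submit \
        --master yarn \
        --deploy-mode cluster \
        --conf spark.yarn.submit.waitAppCompletion=false \
        --class com.example.StreamingApp \
        myapp.jar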

More at https://spark.apache.org/docs/latest/running-on-yarn.html


Alternatively, timeout TIME CMD will kill CMD once TIME has elapsed.
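For example, to kill the local spark-submit client 60 seconds after launch (again assuming cluster deploy mode, so the driver survives the client; the jar is a placeholder):

    # GNU coreutils timeout: terminate the local client after 60 seconds.
    # The YARN application itself keeps running. myapp.jar is hypothetical.
    timeout 60s spark-submit --master yarn --deploy-mode cluster myapp.jar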

