We run the Spark job through spark-submit, and I see that the job is resubmitted (attempt #2) when it fails.
How can I prevent the second attempt when a container is killed by YARN, or when the job fails with an exception?

In our case the failure was caused by a lack of memory and a "GC overhead limit exceeded" error.
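
For reference, a minimal sketch of one way to cap the retries, assuming cluster mode on YARN: the spark.yarn.maxAppAttempts setting limits how many times YARN will relaunch the application master (com.example.MyApp and my-app.jar below are placeholder names):

# Sketch: limit YARN to a single application attempt.
# The class name and jar are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.MyApp \
  my-app.jar

With this set to 1, a failed first attempt (for example, a container killed for exceeding memory limits) should not trigger attempt #2.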
scala yarn apache-spark
codingtwinky