
Commit c352398

kotlovs authored and committed
[SPARK-34674][CORE][K8S] Close SparkContext after the Main method has finished
### What changes were proposed in this pull request?

Close SparkContext after the Main method has finished, to allow SparkApplication on K8S to complete. This is a fixed version of the [merged and reverted PR](apache#32081).

### Why are the changes needed?

If sparkContext.stop() is not called explicitly, the Spark driver process does not terminate even after its Main method has completed. This behaviour differs from Spark on YARN, where stopping the sparkContext manually is not required. The problem appears to be non-daemon threads, which prevent the driver JVM process from terminating, so this change closes the sparkContext automatically.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually on a production AWS EKS environment.

Closes apache#32283 from kotlovs/close-spark-context-on-exit-2.

Authored-by: skotlov <skotlov@joom.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
(cherry picked from commit b17a0e6)
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
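The "non-daemon threads" point above is standard JVM behaviour: the process exits only when no non-daemon threads remain, so any non-daemon thread left running after main() returns keeps the driver alive. A minimal stand-alone sketch (this is a hypothetical demo, not Spark code):

```scala
// Demonstrates why a leftover non-daemon thread keeps the driver JVM alive.
object DaemonDemo {
  // Spawn a short-lived worker; whether the JVM waits for it at shutdown
  // depends only on the daemon flag.
  def spawn(daemon: Boolean): Thread = {
    val t = new Thread(() => Thread.sleep(100))
    t.setDaemon(daemon)
    t.start()
    t
  }

  def main(args: Array[String]): Unit = {
    spawn(daemon = true)  // the JVM is free to exit while this still runs
    spawn(daemon = false) // the JVM will not exit until this one finishes
  }
}
```

Spark's driver-side machinery starts such non-daemon threads, which is why the process hangs unless the SparkContext is stopped.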
1 parent 3c35c38 commit c352398

1 file changed

Lines changed: 9 additions & 0 deletions

core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
```diff
@@ -956,6 +956,15 @@ private[spark] class SparkSubmit extends Logging {
     } catch {
       case t: Throwable =>
         throw findCause(t)
+    } finally {
+      if (!isShell(args.primaryResource) && !isSqlShell(args.mainClass) &&
+          !isThriftServer(args.mainClass)) {
+        try {
+          SparkContext.getActive.foreach(_.stop())
+        } catch {
+          case e: Throwable => logError(s"Failed to close SparkContext: $e")
+        }
+      }
     }
   }
 
```
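Stripped of the Spark specifics, the patch follows a standard try/finally cleanup pattern: run the user's main body, then best-effort-stop the active context, logging rather than rethrowing any failure in the cleanup so it cannot mask an exception from the body itself. A minimal stand-alone sketch (the name `ResourceRunner` and the `AutoCloseable` stand-in for SparkContext are hypothetical):

```scala
// Hypothetical sketch of the patched control flow in SparkSubmit.runMain;
// an Option[AutoCloseable] stands in for SparkContext.getActive.
object ResourceRunner {
  def runMain(body: () => Unit, active: Option[AutoCloseable]): Unit = {
    try {
      body()
    } finally {
      // Mirrors the patch: close the active resource if one exists.
      // A failure here is logged, not thrown, so it never shadows an
      // exception raised by the body.
      try active.foreach(_.close())
      catch {
        case e: Throwable => Console.err.println(s"Failed to close context: $e")
      }
    }
  }
}
```

Note the real patch also skips the cleanup for interactive modes (shell, SQL shell, Thrift server), where the context must outlive the main method.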
