spark-submit does not terminate the driver when the client process terminates on Kubernetes

1/1/2019

I am using Spark 2.4.0 on Kubernetes. If I run the following from a pod:

$SPARK_HOME/bin/spark-submit \
    --master k8s://https://master-node \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.executor.cores=1 \
    --conf spark.kubernetes.container.image=container-image \
    local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar 1000000

and then exit the process with Ctrl+C, the application (driver and executors) keeps running. This is a problem for me with streaming jobs: when the client pod terminates, the application keeps running in the cluster.
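
As far as I can tell, in cluster mode spark-submit only creates the driver pod and then watches it (with the default spark.kubernetes.submission.waitAppCompletion=true it just streams the application status until completion), so Ctrl+C kills the watcher, not the driver. The manual workaround I have is deleting the driver pod, which also removes the executors through their owner references. The pod name below is illustrative; Spark derives it from the app name plus a launch timestamp:

# find the driver (spark-role=driver is a label Spark puts on its pods)
kubectl get pods -l spark-role=driver
# deleting the driver pod cascades to the executor pods via owner references
kubectl delete pod spark-pi-1546300800000-driver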

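Alternatively, since client mode is supported on Kubernetes as of 2.4, running the driver inside the client pod would tie its lifetime to that pod. This is only a sketch: it assumes the pod IP returned by hostname -i is routable from the executors, and it sets spark.kubernetes.driver.pod.name to the pod's own name ($HOSTNAME inside a Kubernetes pod) so that the executors are owned by the client pod and get cleaned up when it dies:

# sketch: client mode, driver lives and dies with this pod
$SPARK_HOME/bin/spark-submit \
    --master k8s://https://master-node \
    --deploy-mode client \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.executor.cores=1 \
    --conf spark.kubernetes.container.image=container-image \
    --conf spark.driver.host=$(hostname -i) \
    --conf spark.kubernetes.driver.pod.name=$HOSTNAME \
    local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar 1000000

Is there a way to get this lifecycle behaviour in cluster mode instead?
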
-- jamborta
apache-spark
kubernetes

0 Answers