Where to place the Spark application while spark-submitting it to Kubernetes?

7/7/2018

I have the same case as in this topic: "Spark on K8s - getting error: kube mode not support referencing app dependencies in local".

I run Spark from a container. https://github.com/gettyimages/docker-spark/blob/master/Dockerfile

bin/spark-submit \
--master k8s://https://kubernetes:6443 \
--deploy-mode cluster \
--name spark-pi \
--class org.apache.spark.examples.SparkPi \
--conf spark.kubernetes.namespace=spark \
--conf spark.executor.instances=5 \
--conf spark.kubernetes.container.image=gcr.io/cloud-solutions-images/spark:v2.3.0-gcs \
--conf spark.kubernetes.authenticate.submission.caCertFile=/var/run/secrets/kubernetes.io/serviceaccount/k8.crt \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \ 
local:///usr/spark-2.3.0/examples/jars/spark-examples_2.11-2.3.0.jar

Error:

Exception in thread "main" org.apache.spark.SparkException: The Kubernetes mode does not yet support referencing application dependencies in the local file system.
        at org.apache.spark.deploy.k8s.submit.DriverConfigOrchestrator.getAllConfigurationSteps(DriverConfigOrchestrator.scala:122)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication$$anonfun$run$5.apply(KubernetesClientApplication.scala:229)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication$$anonfun$run$5.apply(KubernetesClientApplication.scala:227)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2585)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:227)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-07-07 05:56:27 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-07-07 05:56:27 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-67367f1a-4ea8-43d6-98e4-23bd6015c6a6

The possible solution suggested there is not fully demonstrated.

I do not know what to do or how to fix this. The Spark version is 2.3.0.

I tried downloading a newer version of the spark-kubernetes jar into spark/jars.

I copied spark-kubernetes_2.11-2.3.1.jar and renamed it to spark-kubernetes_2.11-2.3.0.jar.

After this change, Spark no longer finds the corresponding Kubernetes files.

-- JDev
apache-spark
kubernetes
scala
spark-submit

1 Answer

7/30/2018

This is the command that worked for me:

/opt/spark/bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master k8s://https://$KUBERNETES_SERVICE_HOST:$KUBERNETES_SERVICE_PORT \
--conf spark.kubernetes.namespace=myproject \
--deploy-mode cluster \
--conf spark.app.name=my-spark-app \
--conf spark.kubernetes.container.image=jkremser/openshift-spark:2.3-latest \
--conf spark.kubernetes.submission.waitAppCompletion=false \
--conf spark.kubernetes.driver.label.radanalytics.io/app=my-spark-app \
--conf spark.driver.cores=0.100000 \
--conf spark.kubernetes.driver.limit.cores=200m \
--conf spark.driver.memory=512m \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-operator \
--conf spark.kubernetes.driver.label.version=2.3.0 \
--conf spark.kubernetes.executor.label.sparkoperator.k8s.io/app-name=my-spark-app \
--conf spark.executor.instances=1 \
--conf spark.executor.cores=1 \
--conf spark.executor.memory=512m \
--conf spark.kubernetes.executor.label.version=2.3.0 \
--conf spark.jars.ivy=/tmp/.ivy2 \
local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar

Make sure the /usr/spark-2.3.0/examples/jars/spark-examples_2.11-2.3.0.jar file is present in the container image (gcr.io/cloud-solutions-images/spark:v2.3.0-gcs in your case): a local:// URL does not point to a file on the machine you submit from, but to a file inside the container.
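
A quick non-interactive way to verify this (assuming the image lets you run an arbitrary command, as the interactive example below does) is to list the directory from your local:// path directly:

docker run --rm gcr.io/cloud-solutions-images/spark:v2.3.0-gcs \
  ls -l /usr/spark-2.3.0/examples/jars/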

You can also inspect the image interactively with something like:

docker run --rm -ti gcr.io/cloud-solutions-images/spark:v2.3.0-gcs sh

and look at what is there (i.e., where the jar with the examples is located).
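
If the jar is not in the image, or you want to run your own application, one option is to build a derived image that copies the jar in and then point spark-submit at it with a local:// path. A minimal sketch, assuming a hypothetical my-app.jar and building on top of the image from your question:

# Dockerfile (illustrative): extend the image used for
# spark.kubernetes.container.image and add your application jar
FROM gcr.io/cloud-solutions-images/spark:v2.3.0-gcs
COPY target/my-app.jar /opt/spark-apps/my-app.jar

Then build and push it to a registry your cluster can pull from (names are illustrative):

docker build -t <your-registry>/spark-my-app:v2.3.0 .
docker push <your-registry>/spark-my-app:v2.3.0

and submit with --conf spark.kubernetes.container.image=<your-registry>/spark-my-app:v2.3.0 and local:///opt/spark-apps/my-app.jar as the application resource.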

Good luck.

-- Jiri Kremser
Source: StackOverflow