Deploying a Spark jar on a Kubernetes cluster

5/14/2018

I am currently trying to deploy a Spark example jar on a Kubernetes cluster running on IBM Cloud. I am not very experienced with either of them, so I hope you guys can help me.

If I try to follow these instructions to deploy Spark on a Kubernetes cluster (https://spark.apache.org/docs/latest/running-on-kubernetes), I am not able to launch Spark Pi, because I always get the error message

The system cannot find the file specified

after running the command

bin/spark-submit \
    --master k8s://<url of my kubernetes cluster> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///examples/jars/spark-examples_2.11-2.3.0.jar

although I am in the right directory, and the file exists in the examples/jars directory.

I would be very thankful for your help!

-- Pascal
apache-spark
kubernetes

2 Answers

5/14/2018

Please make sure the absolute path /examples/jars/spark-examples_2.11-2.3.0.jar exists.

Or, if you are trying to load a jar file from the current directory, it should be a relative path like local://./examples/jars/spark-examples_2.11-2.3.0.jar.

I'm not sure whether spark-submit accepts relative paths or not.
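
A quick way to check whether that absolute path actually exists inside the container image (a sketch; <spark-image> stands for the image you pass to spark-submit):

# List the contents of the path inside the image; if this fails or the
# jar is missing, the local:// URI in spark-submit will not resolve.
docker run --rm <spark-image> ls -l /examples/jars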

-- silverfox
Source: StackOverflow

5/15/2018

Ensure your .jar file is present inside the container image.

The instructions say it should be there:

Finally, notice that in the above example we specify a jar with a specific URI with a scheme of local://. This URI is the location of the example jar that is already in the Docker image.
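
A minimal sketch of how this usually fits together for Spark 2.3.0: build the image with the docker-image-tool.sh script shipped in the Spark distribution (which copies the examples into the image), then reference the jar at its in-image path. The repository name and tag below are placeholders, and /opt/spark is the install location assumed from the stock Dockerfile, so verify the path in your own image:

# Build and push an image that contains the Spark distribution,
# including examples/jars (run from the Spark 2.3.0 distribution root).
bin/docker-image-tool.sh -r <your-repo> -t v2.3.0 build
bin/docker-image-tool.sh -r <your-repo> -t v2.3.0 push

# Submit against the in-image path rather than a path on your machine:
bin/spark-submit \
    --master k8s://<url of my kubernetes cluster> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<your-repo>/spark:v2.3.0 \
    local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar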

-- VAS
Source: StackOverflow