Deploy Spark into a Kubernetes Cluster

10/3/2018

I'm a newbie in the Kubernetes and Spark environment. I've been asked to deploy Spark inside Kubernetes so that it can do automatic horizontal scaling.

The problem is, I can't deploy the SparkPi example from the official documentation (https://spark.apache.org/docs/latest/running-on-kubernetes#cluster-mode).

I've already followed the instructions, but the pods fail to execute. Here is the explanation:

  1. I already ran: kubectl proxy

  2. When I execute:

spark-submit \
  --master k8s://https://localhost:6445 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=5 \
  --conf spark.kubernetes.container.image=xnuxer88/spark-kubernetes-bash-test-entry:v1 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.2.jar

I get this error when executing spark-submit (taken from the pod logs):

Error: Could not find or load main class org.apache.spark.examples.SparkPi

  3. When I check the Docker image (by creating a container from the related image), I can find the file; it does exist. Roughly what I ran is shown below.
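
For reference, this is roughly how I checked the pod logs and the image contents (the driver pod name is a placeholder, and these are just the commands I used, not taken from the Spark docs):

# list the pods that spark-submit created and find the driver pod
kubectl get pods

# read the driver pod logs, which is where the "Could not find or load main class" error shows up
# (<spark-pi-driver-pod> is a placeholder for the real pod name)
kubectl logs <spark-pi-driver-pod>

# list the example JARs inside the image to confirm the file is really there
docker run --rm --entrypoint ls xnuxer88/spark-kubernetes-bash-test-entry:v1 /opt/spark/examples/jars/

The last command does list spark-examples_2.11-2.3.2.jar, so the JAR is present in the image at the same path I pass to spark-submit.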

Is there any instruction that I missed or forgot to follow?

Please Help.

Thank You.

-- Xnuxer
apache-spark
kubernetes
