spark-on-operator NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher

4/22/2020

When I use spark-on-operator to submit a Spark task, an error occurs. I submit the job with kubectl apply -f examples/spark-pi.yaml, but it fails. My Spark version is 2.4.4 and the spark-on-operator version is v1beta2-1.0.1-2.4.4. Here is the message:

spark-operator  failed to submit SparkApplication spark-pi: failed to run spark-submit for SparkApplication default/spark-pi: Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher.
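A NoClassDefFoundError like the one above means the JVM could not find the class on the runtime classpath when it was first needed. A quick way to check whether a given class is actually visible inside a JVM (for example, by running a small program inside the image) is a Class.forName probe; this is just a diagnostic sketch, and the class name below is taken from the error message:

```java
// Probe whether a class is visible on the current runtime classpath.
// "io.fabric8.kubernetes.client.Watcher" is the class named in the error;
// any fully-qualified class name could be substituted.
public class ClasspathCheck {
    public static void main(String[] args) {
        String cls = "io.fabric8.kubernetes.client.Watcher";
        try {
            Class.forName(cls);
            System.out.println(cls + " is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println(cls + " is MISSING from the classpath");
        }
    }
}
```

If this prints MISSING when run inside the container, the jar is either not in a directory that spark-submit puts on the classpath or is not readable by the user the JVM runs as.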

Afterwards, I tried to modify the Dockerfile to add kubernetes-client-4.1.2.jar to the image and re-created the Docker image, but the error message is the same. My Dockerfile is:

ARG spark_image=gcr.io/spark-operator/spark:v2.4.4

FROM $spark_image

RUN mkdir -p /opt/spark/jars
ADD https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.1.2/kubernetes-client-4.1.2.jar /opt/spark/jars
ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
ENTRYPOINT [ "/opt/entrypoint.sh" ]
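One thing worth ruling out with this Dockerfile: Docker's ADD from a remote URL stores the downloaded file with mode 600 and root ownership, so a non-root Spark user may not be able to read the jar even though it is present. A variant that makes the permissions explicit could look like the sketch below (the chmod step is the only substantive change from the Dockerfile above):

```dockerfile
ARG spark_image=gcr.io/spark-operator/spark:v2.4.4

FROM $spark_image

USER root
RUN mkdir -p /opt/spark/jars
# ADD from a URL leaves the file with mode 600; make it world-readable
# so the non-root user the Spark image runs as can load it.
ADD https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.1.2/kubernetes-client-4.1.2.jar /opt/spark/jars/kubernetes-client-4.1.2.jar
RUN chmod 644 /opt/spark/jars/kubernetes-client-4.1.2.jar

ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
ENTRYPOINT [ "/opt/entrypoint.sh" ]
```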

Any help would be appreciated.

-- ztcheck
apache-spark
docker
kubernetes

0 Answers