Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher

4/22/2020

I am following this tutorial to run the Spark-Pi application with the kubectl command: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/quick-start-guide.md#running-the-examples.

When I use spark-on-k8s-operator to submit a Spark task, an error occurs. My Spark version is 2.4.4 and the operator version is v1beta2-1.0.1-2.4.4. I submit and inspect the job as shown below.
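These are the commands (a sketch: I assume the default namespace that the example manifest uses; the describe command is what produced the events below):

kubectl apply -f examples/spark-pi.yaml
# check the SparkApplication's status and recent events
kubectl describe sparkapplication spark-pi

Here are the messages: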

Warning  SparkApplicationFailed  3m  spark-operator  SparkApplication spark-pi failed: failed to run spark-submit for SparkApplication default/spark-pi: Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher
       at java.lang.ClassLoader.defineClass1(Native Method)
       at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
       at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
       at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
       at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
       at java.security.AccessController.doPrivileged(Native Method)
       at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
       at java.lang.ClassLoader.defineClass1(Native Method)
       at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
       at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
       at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
       at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
       at java.security.AccessController.doPrivileged(Native Method)
       at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
       at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:233)
       at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:204)
       at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
       at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
       at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
       at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
       at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: io.fabric8.kubernetes.client.Watcher
  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 33 more

Afterwards, I tried modifying the Dockerfile to add kubernetes-client-4.1.2.jar, re-created the Docker image, and referenced it from the YAML file, but the error message is the same. My Dockerfile is:

ARG spark_image=gcr.io/spark-operator/spark:v2.4.4

FROM $spark_image

RUN mkdir -p /opt/spark/jars
# add the fabric8 kubernetes-client jar to Spark's jar directory
ADD https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.1.2/kubernetes-client-4.1.2.jar /opt/spark/jars
ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
ENTRYPOINT [ "/opt/entrypoint.sh" ]
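For completeness, the build-and-deploy loop I used looks like this (a sketch; <your-registry> is a placeholder for my actual registry, and spec.image is the field in examples/spark-pi.yaml that selects the driver/executor image):

docker build -t <your-registry>/spark:v2.4.4-k8s-client .
docker push <your-registry>/spark:v2.4.4-k8s-client
# point spec.image in examples/spark-pi.yaml at the new tag, then re-apply
kubectl apply -f examples/spark-pi.yaml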

Any help would be appreciated.

-- ztcheck
apache-spark
kubernetes

1 Answer

4/23/2020

spark-on-k8s-operator requires Kubernetes 1.13+, but the OpenShift cluster I was using ran Kubernetes 1.11, so moving to a cluster with a newer Kubernetes version solved the problem.
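To confirm which Kubernetes version a cluster is actually running (on OpenShift, oc version reports the same information):

kubectl version
# the "Server Version" line is the cluster's Kubernetes version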

-- ztcheck
Source: StackOverflow