I cannot run a custom Spark application on Kubernetes.
I followed the setup guide at https://spark.apache.org/docs/latest/running-on-kubernetes.html and the steps in examples like this one: https://towardsdatascience.com/how-to-build-spark-from-source-and-deploy-it-to-a-kubernetes-cluster-in-60-minutes-225829b744f9, and I can run the spark-pi example.
I even rebuilt the Spark image so that it contains my xxx.jar in both /opt/spark/examples/jars and /opt/spark/jars, but I still get the "failed to load class" error. Any ideas what I may have missed? This is extra baffling to me because I checked that the jar is part of the image, right next to the example jars, and those work fine.
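For context, a minimal Dockerfile sketch of what "rebuilding the image with the jar baked in" could look like (the base image tag here is an assumption; use whatever image the spark-pi example ran with):

```dockerfile
# Sketch only: extend an existing Spark image and bake the application jar in.
# The base image name/tag is an assumption, not the exact one used here.
FROM spark:latest

# Put the jar where the spark-submit command's local:// path expects it.
COPY xxx.jar /opt/spark/examples/jars/xxx.jar
```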
I run spark-submit like this:
bin/spark-submit \
  --master k8s://https://localhost:6443 \
  --deploy-mode cluster \
  --conf spark.executor.instances=3 \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  --conf spark.kubernetes.container.image=spark2:latest \
  --conf spark.kubernetes.container.image.pullPolicy=Never \
  --class com.xxx.Application \
  --name myApp \
  local:///opt/spark/examples/jars/xxx.jar

Update: added the stack trace:
spark.driver.bindAddress=10.1.0.68 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class com.xxx.Application local:///opt/spark/examples/jars/xxx.jar
20/05/13 11:02:09 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Error: Failed to load class com.xxx.Application.
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).

Thanks!
Verify that the jar you are referring to actually contains Application.class at the path com/xxx/Application.class, using jar tf <jar name>.jar (tf lists the archive's entries without extracting them). Then replace

local:///opt/spark/examples/jars/xxx.jar

with

/opt/spark/examples/jars/xxx.jar

in your spark-submit command.