I'm running a Java Spark app in cluster mode on Kubernetes. At the moment I'm using the SparkLauncher Java API to add a listener that receives notifications about the lifecycle of the running app. As far as I understand, the listener mechanism is not fully implemented when running on k8s: the code I'm using closes the Spark context on the driver pod, and this results in state LOST in the SparkAppHandle.Listener.
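For reference, a minimal sketch of the listener setup I'm describing (app resource, main class and master URL are placeholders, not my actual config):

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchWithListener {
    public static void main(String[] args) throws Exception {
        // Placeholder values for illustration only.
        SparkAppHandle handle = new SparkLauncher()
            .setAppResource("local:///opt/spark/jars/my-app.jar")
            .setMainClass("com.example.MyApp")
            .setMaster("k8s://https://kubernetes.default.svc")
            .setDeployMode("cluster")
            .startApplication(new SparkAppHandle.Listener() {
                @Override
                public void stateChanged(SparkAppHandle h) {
                    // On k8s cluster mode this ends up as LOST instead of FINISHED/FAILED.
                    System.out.println("State changed: " + h.getState());
                }

                @Override
                public void infoChanged(SparkAppHandle h) {
                    System.out.println("App id: " + h.getAppId());
                }
            });

        // Block until the handle reaches a terminal state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1_000);
        }
    }
}
```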
So my question is: which approach should I use to get notified about the state of the app I submitted? At the moment it is enough to know the exit code of the app, not the progress or the state of every executor.
One tool I do see is the driver log output: it already contains the information I'm interested in, and I believe it is the class LoggingPodStatusWatcher.scala that logs it.
So can someone point me to the smartest way to gather the exit code programmatically from the app I started?
Thanks, Marko
My latest finding on how to answer this question is to use the fabric8io library and its watch API.
Examples can be found here: https://github.com/fabric8io/kubernetes-client/blob/master/kubernetes-tests/src/test/java/io/fabric8/kubernetes/client/mock/PodTest.java
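Here is a minimal sketch of that approach, assuming fabric8 kubernetes-client 6.x; the namespace and driver pod name are placeholders (in practice you know the driver pod name from your spark-submit/k8s configuration). The idea is to watch the driver pod and, once it reaches a terminal phase, read the exit code from the terminated container state:

```java
import io.fabric8.kubernetes.api.model.ContainerStatus;
import io.fabric8.kubernetes.api.model.Pod;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import io.fabric8.kubernetes.client.Watcher;
import io.fabric8.kubernetes.client.WatcherException;

import java.util.concurrent.CountDownLatch;

public class DriverPodExitCodeWatcher {
    public static void main(String[] args) throws Exception {
        // Placeholder values; adapt to your deployment.
        String namespace = "default";
        String driverPodName = "my-spark-app-driver";

        CountDownLatch done = new CountDownLatch(1);

        try (KubernetesClient client = new KubernetesClientBuilder().build()) {
            client.pods()
                  .inNamespace(namespace)
                  .withName(driverPodName)
                  .watch(new Watcher<Pod>() {
                      @Override
                      public void eventReceived(Action action, Pod pod) {
                          // The driver pod ends up in phase Succeeded or Failed when the Spark app terminates.
                          String phase = pod.getStatus().getPhase();
                          if ("Succeeded".equals(phase) || "Failed".equals(phase)) {
                              for (ContainerStatus cs : pod.getStatus().getContainerStatuses()) {
                                  if (cs.getState().getTerminated() != null) {
                                      // The exit code of the driver container is the exit code of the app.
                                      int exitCode = cs.getState().getTerminated().getExitCode();
                                      System.out.println("Driver finished with exit code " + exitCode);
                                  }
                              }
                              done.countDown();
                          }
                      }

                      @Override
                      public void onClose(WatcherException cause) {
                          // Watch connection closed (e.g. API server timeout); re-watch if needed.
                          done.countDown();
                      }
                  });

            // Block until a terminal phase was observed or the watch closed.
            done.await();
        }
    }
}
```

This avoids relying on SparkAppHandle entirely: the pod status reported by the Kubernetes API is the source of truth for whether the driver succeeded or failed.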