How could I use spark-operator-k8s to listen to a queue in RabbitMQ and then trigger Spark jobs with the AMQP message as job arguments?

3/23/2020

I have a spark-operator-k8s setup that I am able to run using kubectl commands, but now I want to use it as a consumer for RabbitMQ. Suggestions on this approach are also welcome, and I would like to know if there are better ways to achieve this.

-- Sathish
apache-spark
kubernetes

1 Answer

4/1/2020

For those who are wondering, here is how I solved the above.

We use spark-k8s-operator to manage Spark applications in k8s. It turns out that spark-k8s-operator was not designed to listen to AMQP messages, so in order to get it to trigger a spark-submit, we generate the SparkApplication manifest dynamically from within the Celery worker pod and then call the Kubernetes REST API with that manifest.
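A minimal sketch of that flow is below, assuming the pika and kubernetes Python packages, a queue named spark-jobs, and an illustrative application image and entry point (the answer above uses a Celery worker, but this consumer loop uses pika directly to keep the example self-contained):

    import json
    import uuid

    import pika
    from kubernetes import client, config


    def build_manifest(job_args):
        """Render a SparkApplication manifest with the AMQP payload as job arguments."""
        return {
            "apiVersion": "sparkoperator.k8s.io/v1beta2",
            "kind": "SparkApplication",
            "metadata": {
                "name": f"spark-job-{uuid.uuid4().hex[:8]}",
                "namespace": "default",
            },
            "spec": {
                "type": "Python",
                "mode": "cluster",
                "image": "my-registry/my-spark-app:latest",      # assumed image
                "mainApplicationFile": "local:///opt/app/main.py",  # assumed entry point
                "arguments": job_args,
                "sparkVersion": "3.0.0",
                "driver": {"cores": 1, "memory": "512m", "serviceAccount": "spark"},
                "executor": {"cores": 1, "instances": 2, "memory": "512m"},
            },
        }


    def on_message(ch, method, properties, body):
        # The message body is assumed to be a JSON list of job arguments.
        job_args = json.loads(body)
        manifest = build_manifest(job_args)
        # Create the SparkApplication custom resource; the operator picks it up
        # and runs spark-submit, just as it would for a kubectl-applied manifest.
        k8s = client.CustomObjectsApi()
        k8s.create_namespaced_custom_object(
            group="sparkoperator.k8s.io",
            version="v1beta2",
            namespace="default",
            plural="sparkapplications",
            body=manifest,
        )


    if __name__ == "__main__":
        config.load_incluster_config()  # use load_kube_config() when running outside the cluster
        connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
        channel = connection.channel()
        channel.queue_declare(queue="spark-jobs", durable=True)
        channel.basic_consume(queue="spark-jobs", on_message_callback=on_message, auto_ack=True)
        channel.start_consuming()

The create_namespaced_custom_object call targets the sparkoperator.k8s.io/v1beta2 API group registered by the operator, which is the programmatic equivalent of applying the manifest with kubectl.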

To trigger the same thing manually using kubectl, one could write the SparkApplication manifest by hand and run kubectl apply -f <manifest.yml>.
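For reference, a hand-written manifest along the lines of the operator's spark-pi example (the image, jar path, and resource sizes here are illustrative) looks roughly like this:

    apiVersion: sparkoperator.k8s.io/v1beta2
    kind: SparkApplication
    metadata:
      name: spark-pi
      namespace: default
    spec:
      type: Scala
      mode: cluster
      image: gcr.io/spark-operator/spark:v3.0.0
      mainClass: org.apache.spark.examples.SparkPi
      mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar
      arguments:
        - "1000"
      sparkVersion: "3.0.0"
      restartPolicy:
        type: Never
      driver:
        cores: 1
        memory: 512m
        serviceAccount: spark
      executor:
        cores: 1
        instances: 2
        memory: 512m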

-- Sathish
Source: StackOverflow