How to configure RabbitMQ with spark-k8s-operator

4/4/2020

I have a RabbitMQ queue from which each AMQP message is consumed by a Celery worker, which then triggers a Spark Operator application job using the Kubernetes-native REST APIs. The problem is that all messages in RabbitMQ are prefetched by the Celery worker at once and pushed into the Spark k8s operator's internal work queue, and I am then unable to track the jobs via the spark-operator. Instead, I would like to understand whether the Spark k8s operator can be configured with RabbitMQ in a better way.
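
For context, here is a minimal sketch of the kind of wiring I mean, with Celery prefetch limited to one message at a time so jobs stay visible in RabbitMQ instead of piling up in the worker. The broker URL, namespace, app name, image, and file paths are placeholders, not my actual setup:

    # Sketch: limit Celery prefetch and submit a SparkApplication
    # through the Kubernetes custom-objects API.
    from celery import Celery
    from kubernetes import client, config

    app = Celery("spark_jobs", broker="amqp://guest@rabbitmq//")  # placeholder broker URL

    # Fetch one unacked message per worker process, and ack only after
    # the task finishes, so pending messages remain in RabbitMQ.
    app.conf.worker_prefetch_multiplier = 1
    app.conf.task_acks_late = True

    @app.task
    def submit_spark_job(job_id: str) -> None:
        config.load_incluster_config()  # or load_kube_config() outside the cluster
        manifest = {
            "apiVersion": "sparkoperator.k8s.io/v1beta2",
            "kind": "SparkApplication",
            "metadata": {"name": f"job-{job_id}", "namespace": "default"},
            "spec": {
                "type": "Python",
                "mode": "cluster",
                "image": "gcr.io/spark-operator/spark-py:v2.4.5",   # placeholder
                "mainApplicationFile": "local:///opt/app/main.py",  # placeholder
                "sparkVersion": "2.4.5",
                "driver": {"cores": 1, "memory": "512m", "serviceAccount": "spark"},
                "executor": {"cores": 1, "instances": 1, "memory": "512m"},
            },
        }
        client.CustomObjectsApi().create_namespaced_custom_object(
            group="sparkoperator.k8s.io",
            version="v1beta2",
            namespace="default",
            plural="sparkapplications",
            body=manifest,
        )

With worker_prefetch_multiplier set to 1 and task_acks_late enabled, unprocessed messages stay queued in RabbitMQ rather than in the operator's internal queue, so they can at least be tracked there. Is this the right approach, or is there a more direct integration?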

-- Sathish
apache-spark
kubernetes
kubernetes-operator
pyspark
rabbitmq

0 Answers