Scheduler not queuing jobs

12/2/2017

I'm trying to test out Airflow on Kubernetes. The Scheduler, Worker, Queue, and Webserver are each on separate deployments, and I am using the Celery executor to run my tasks.

Everything works fine except that the Scheduler does not queue up jobs. Airflow runs my tasks fine when I trigger them manually from the web UI or CLI, but the scheduler never picks them up on its own, which is exactly what I am trying to test.

My configuration is almost the same as it is on a single server:

sql_alchemy_conn = postgresql+psycopg2://username:password@localhost/db
broker_url = amqp://user:password@$RABBITMQ_SERVICE_HOST:5672/vhost
celery_result_backend = amqp://user:password@$RABBITMQ_SERVICE_HOST:5672/vhost
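One thing worth checking in these URLs is the hostname. Each Kubernetes pod resolves `localhost` to itself, so a `localhost` database URL only works when everything runs in one place. A minimal sketch (a hypothetical helper, not part of Airflow) that pulls the host out of a connection URL and flags single-pod hosts:

```python
from urllib.parse import urlsplit

def check_cross_pod_host(url):
    """Return (hostname, is_local) for a connection URL.

    is_local is True when the host is one that every pod resolves
    to itself, so other pods cannot reach the service through it.
    """
    host = urlsplit(url).hostname
    return host, host in ("localhost", "127.0.0.1")

# The scheduler, workers, and webserver are separate pods, so the
# database host must be a name they can all resolve, e.g. a
# Kubernetes service name rather than localhost.
host, is_local = check_cross_pod_host(
    "postgresql+psycopg2://username:password@localhost/db")
```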

I believe these settings should be enough to make it run, but for some reason only the workers can see the DAGs and their state; the scheduler cannot, even though it logs their heartbeats just fine. Is there anything else I should debug or look at?

-- Minh Mai
airflow
airflow-scheduler
kubernetes
rabbitmq

1 Answer

12/4/2017

First, you are using Postgres as the Airflow database, aren't you? Did you deploy a pod and a service for Postgres? If so, verify that your config file has:

sql_alchemy_conn = postgresql+psycopg2://username:password@serviceNamePostgres/db 

You can use this GitHub repository. I used it three weeks ago for a first test and it worked pretty well. Its entrypoint is useful for verifying that RabbitMQ and Postgres are configured correctly.
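The wait-for logic such an entrypoint typically uses can be sketched in Python (the service names at the bottom are hypothetical placeholders for your actual Kubernetes services):

```python
import socket
import time

def wait_for_service(host, port, timeout=60.0, interval=1.0):
    """Poll a TCP endpoint until it accepts connections or the
    timeout expires; returns True on success, False otherwise."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False

# An entrypoint would block on the backing services before starting
# the scheduler, e.g.:
# wait_for_service("airflow-postgres", 5432)
# wait_for_service("rabbitmq", 5672)
```

If either call fails, the scheduler pod has a service-name or networking problem rather than an Airflow one.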

-- pcc
Source: StackOverflow