Airflow will not record logs after the queued state

8/16/2021

I have deployed Airflow on Kubernetes using the KubernetesExecutor. Everything seems to work fine until I try to access the logs. The UI does show a log page for each task, but the complete logs are never written: for every task I execute, no matter how many print statements I add, the log only shows output up to the queued state, and none of my logged messages appear. I have tried plain print statements as well as the logging module (a minimal example of the kind of task I am testing with is included below). I have also tried different Helm charts, and all of them give the same result. Reading the logs from the Kubernetes pod itself gives the same output as the Airflow UI.

2021-08-16 21:47:44,062 {dagbag.py:448} INFO - Filling up the DagBag from /home/airflow/.local/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_1 2021-08-16T21:47:15.407072+00:00 queued> on host examplebashoperatorrunme1.6ef611f7e3c143d2a7a1df5e91984a82

This is the complete log. The pods terminate successfully and everything else seems to work, except the logs. Could someone help? I have been trying every parameter I can find in the values yaml file for days and am still stuck on this issue.

This has nothing to do with remote logging, since I cannot even access the local logs.
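For reference, this is the shape of the test task I have been running. It is a minimal sketch, assuming Airflow 2.x; the DAG and task names are illustrative:

    # Minimal test DAG to check whether task output reaches the logs.
    import logging
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    logger = logging.getLogger(__name__)


    def noisy_task():
        # Both lines should appear in the task log if logging works at all.
        print("print: task is running")
        logger.info("logging: task is running")


    with DAG(
        dag_id="log_test",
        start_date=datetime(2021, 8, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        PythonOperator(task_id="noisy", python_callable=noisy_task)

When I trigger this DAG, the task succeeds, but neither the print output nor the logger output ever shows up; the log still ends at the queued line shown above.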

-- user45147
airflow
kubernetes
kubernetesexecutor
logging
mlops

0 Answers