Airflow Logs in Kubernetes deployment

7/1/2021

We currently have Airflow deployed in Kubernetes using the Airflow 2.0 Helm chart, and we've also tried the 2.1 chart. When we try to get logs for running DAGs, we get the following:

*** Trying to get logs (last 100 lines) from worker pod dagsentstartpipeline.sdfsdfsfsdfsfsfdsee343443f34f3 ***

Waiting for host: airflow-postgresql.airflow.svc.cluster.local. 5432
[2021-06-30 12:35:52,524] {plugins_manager.py:286} INFO - Loading 1 plugin(s) took 3.23 seconds
[2021-06-30 12:35:53,460] {dagbag.py:440} INFO - Filling up the DagBag from /usr/local/airflow/dags/test_dag.py
Running <TaskInstance: test_dag.start_pipeline 2021-06-30T12:35:40.924151+00:00 [queued]> on host test_dagstartpipeline.70e4739231494c24a3e368
Running <TaskInstance: test_dag.test.start_pipeline 2021-06-30T12:35:40.924151+00:00 [queued]> on host test_dagstartpipeline.70e4739231494c24a3e368
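
For context, /usr/local/airflow/dags/test_dag.py is just a minimal test DAG. The real file isn't reproduced here, so the sketch below is only an assumption of roughly what it contains; the dag_id and the start_pipeline task id come from the log above, everything else is a placeholder:

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator

# Hypothetical stand-in for /usr/local/airflow/dags/test_dag.py;
# only the dag_id and the "start_pipeline" task id come from the logs.
with DAG(
    dag_id="test_dag",
    start_date=datetime(2021, 6, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    start_pipeline = DummyOperator(task_id="start_pipeline")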

Our DAG fails, and we get no indication of why because we cannot see the logs. Are there any suggestions for how we can view these logs?

We have tried setting the worker pods not to delete after running, and we get the same logs.
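
Since the pods are retained, we have also pulled the pod log directly to compare it with what the Airflow UI shows. A minimal sketch using the official kubernetes Python client; the pod name and namespace below are placeholders for our setup:

from kubernetes import client, config

# Load credentials from the local kubeconfig
# (use config.load_incluster_config() when running inside the cluster).
config.load_kube_config()

v1 = client.CoreV1Api()

# Placeholder pod name/namespace -- substitute the retained worker pod.
log_text = v1.read_namespaced_pod_log(
    name="testdagstartpipeline-xxxxxxxx",
    namespace="airflow",
    tail_lines=100,
)
print(log_text)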

Thanks,

-- adan11
airflow
kubernetes
