Unable to see Task logs in Airflow Deployed via Kubernetes

7/1/2021

I am unable to view any of the logs my tasks write in an instance of Airflow deployed on Kubernetes.

For example, I was using logging.info('Log Started') to log something, and it never shows up in the UI. Even in a test DAG containing nothing but log statements, I couldn't see any logs.

I am currently using the Airflow 2.0.2 release, installed via its latest Helm chart with the Kubernetes executor.

I also had to set "AIRFLOW__KUBERNETES__DELETE_WORKER_PODS" to false in the Helm values file to see some of these logs, since my worker pods were completing, but I still can't see anything I try to log in the Airflow UI.

What else do I need to set to see the logs from my tasks properly?

-- vkandvia
airflow
kubernetes
logging

1 Answer

7/9/2021

This is really weird behaviour: I was using the Astronomer chart, and after setting these values under the environment options in the values.yml file, I was able to view those logs.

# Environment variables for all airflow containers
# other than init containers.
env:
  # Keep finished worker pods around so their task logs stay accessible
  - name: "AIRFLOW__KUBERNETES__DELETE_WORKER_PODS"
    value: "false"
  # Give the webserver more time when fetching logs from the workers
  - name: "AIRFLOW__WEBSERVER__LOG_FETCH_TIMEOUT_SEC"
    value: "15"
  - name: "AIRFLOW__LOGGING__LOGGING_LEVEL"
    value: "INFO"
  # Ship task logs to remote storage (Azure Blob via the "azure" connection)
  - name: "AIRFLOW__LOGGING__REMOTE_LOGGING"
    value: "True"
  - name: "AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID"
    value: "azure"
  - name: "AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER"
    value: "wasb_logs"

It's weird because even though I had not entered a connection in the UI yet, I was able to see those logs; they might be stored somewhere temporarily, I guess. After I specify a connection, I can see logs in my Azure Blob Storage once I trigger the second DAG run (it seems to pick up the connection after some time).
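For completeness, the connection itself can also be supplied as an environment variable rather than through the UI, using Airflow's AIRFLOW_CONN_<CONN_ID> mechanism. Below is a minimal sketch for the same env: list, assuming the "azure" conn id from above; the storage account name and access key are purely hypothetical placeholders, and the key typically needs to be URL-encoded inside the URI:

  # Hypothetical sketch: define the "azure" connection as a wasb URI so the
  # scheduler, webserver and workers can see it without a UI-created connection.
  # Replace the placeholders with a real storage account name and access key.
  - name: "AIRFLOW_CONN_AZURE"
    value: "wasb://<storage_account_name>:<url_encoded_access_key>@"

Whichever way the connection is created, its conn id has to match AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID for the remote task handler to use it.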

-- vkandvia
Source: StackOverflow