How to set up persistent logging and DAGs for Airflow running as a Kubernetes pod

6/14/2021

I am working on a local Windows machine with an Ubuntu WSL installation.

$ cat pv.yml
apiVersion: v1
kind: PersistentVolume
metadata:
  name: airflow-volume
  labels:
    type: local
spec:
  # we use local node storage (hostPath) here
  # check available classes with: kubectl get storageclass
  storageClassName: standard
  capacity:
    storage: 1Gi
  accessModes:
    - ReadWriteMany
  hostPath:
    path: "/mnt/c/dumps"


$ cat pvc.yml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: kube-claim
spec:
  storageClassName: standard
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 50Mi
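
For reference, applying the two manifests and checking that the claim actually binds should look roughly like this (a sketch; note the PVC is namespaced and has to live in the same namespace as the Helm release so that existingClaim can resolve it):

$ kubectl apply -f pv.yml
$ kubectl apply -f pvc.yml -n airflow
$ kubectl get pv airflow-volume            # STATUS should become Bound
$ kubectl get pvc kube-claim -n airflow    # should show Bound to airflow-volume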

helm command:

helm upgrade $RELEASE_NAME apache-airflow/airflow --namespace $NAMESPACE \
  --set images.airflow.repository=airflow-dags \
  --set images.airflow.tag=0.2.3 \
  --set executor=KubernetesExecutor \
  --set logs.persistence.enabled=true \
  --set logs.persistence.existingClaim=kube-claim
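
(This only covers log persistence; for the DAGs part of the question I believe the chart exposes analogous dags.persistence.* values, e.g., assuming a second claim named dags-claim already exists:)

helm upgrade $RELEASE_NAME apache-airflow/airflow --namespace $NAMESPACE \
  --set dags.persistence.enabled=true \
  --set dags.persistence.existingClaim=dags-claim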

Error on the logs screen from the webserver:

*** Trying to get logs (last 100 lines) from worker pod batchtestpodruntest.1f82d2f60f9f4766b81ed54abb28e75a ***

*** Unable to fetch logs from worker pod batchtestpodruntest.1f82d2f60f9f4766b81ed54abb28e75a ***
(403)
Reason: Forbidden
HTTP response headers: HTTPHeaderDict({'Content-Type': 'application/json', 'X-Content-Type-Options': 'nosniff', 'Date': 'Mon, 14 Jun 2021 15:21:19 GMT', 'Content-Length': '425'})
HTTP response body: b'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods \"batchtestpodruntest.1f82d2f60f9f4766b81ed54abb28e75a\" is forbidden: User \"system:serviceaccount:airflow:latest-airflow-webserver\" cannot get resource \"pods/log\" in API group \"\" in the namespace \"airflow\"","reason":"Forbidden","details":{"name":"batchtestpodruntest.1f82d2f60f9f4766b81ed54abb28e75a","kind":"pods"},"code":403}\n'
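
Reading the message, the webserver's service account (system:serviceaccount:airflow:latest-airflow-webserver) is not allowed to get pods/log in the airflow namespace. A Role/RoleBinding along these lines would grant that permission (a sketch using the names from the error; the chart normally manages its own RBAC, so this only illustrates what is missing):

apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: pod-log-reader
  namespace: airflow
rules:
  - apiGroups: [""]
    resources: ["pods", "pods/log"]
    verbs: ["get", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: pod-log-reader-binding
  namespace: airflow
subjects:
  - kind: ServiceAccount
    name: latest-airflow-webserver
    namespace: airflow
roleRef:
  kind: Role
  name: pod-log-reader
  apiGroup: rbac.authorization.k8s.io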

-- Programmer007
airflow
kubernetes
kubernetes-pod

1 Answer

6/15/2021

You could configure Airflow to send task logs to S3 or Elasticsearch (remote logging) instead of reading them from the worker pods.
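
For example, remote logging to S3 can be switched on through Airflow's [logging] settings, which the chart exposes under its config value (a sketch; the bucket name and connection id below are placeholders, the S3 connection has to exist in Airflow, and the image needs the Amazon provider installed):

$ cat values-remote-logging.yml
config:
  logging:
    remote_logging: 'True'
    remote_base_log_folder: 's3://my-airflow-logs/logs'
    remote_log_conn_id: 'my_aws_conn'

helm upgrade $RELEASE_NAME apache-airflow/airflow --namespace $NAMESPACE -f values-remote-logging.yml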

-- floating_hammer
Source: StackOverflow