I am running the GKE Airflow Operator from Marketplace, and I have no idea how to store my DAGs in a Google Cloud Storage bucket while deploying them in containers.
I tried the following, but it's not reflected in the bucket:
kubectl apply -f - <<EOF
apiVersion: airflow.k8s.io/v1alpha1
kind: AirflowCluster
metadata:
  name: airflow-cluster
spec:
  executor: Celery
  config:
    airflow:
      AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL: 30  # default is 300s
  redis:
    operator: False
  scheduler:
    version: "1.10.0rc2"
  ui:
    replicas: 1
  worker:
    replicas: 2
  dags:
    subdir: "/dags"
    gcs:
      bucket: "gs://airflowdags"
  airflowbase:
    name: airflow-base
EOF