I've been using kubectl to manually upload Airflow workflows to Kubernetes (/usr/local/airflow/dags). Is it possible to do this without kubectl, e.g. with a Python script or something else? If so, would you be able to share your source or your Python script? Thanks, appreciated.
This totally depends on your setup. For example, we use AWS, so we have the DAGs syncing from an S3 bucket path every 5 minutes; to deploy, we just put DAGs into S3. I've also seen Kubernetes setups that use a shared volume kept up to date from a git repository (e.g. a git-sync sidecar), which might also work for you. Airflow itself (the webserver(s), worker(s), and scheduler) does not offer any hook to upload into the DAG directory.
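If you'd rather roll your own sync in Python than shell out to `aws s3 sync`, a minimal sketch could look like the one below. The bucket name, key prefix, and DAG folder are hypothetical placeholders, and this assumes boto3 with working AWS credentials:

```python
import os
import boto3

BUCKET = "my-airflow-bucket"       # hypothetical bucket name
PREFIX = "dags/"                   # hypothetical key prefix holding DAG files
DAG_DIR = "/usr/local/airflow/dags"

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Walk every object under the prefix and mirror it into the DAG directory.
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):      # skip "directory" placeholder keys
            continue
        rel_path = os.path.relpath(key, PREFIX)
        local_path = os.path.join(DAG_DIR, rel_path)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3.download_file(BUCKET, key, local_path)
```

You could run something like this from cron (or a sidecar container) on whatever interval suits you; our 5-minute sync is the same idea. Note this sketch only downloads; it doesn't delete DAG files that were removed from S3, so you'd need to add that if you want a true mirror.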