Accessing K8s secrets from any Airflow task

7/27/2021

We have an Airflow (Celery executor) setup that can run tasks on our K8s cluster. The tasks that use KubernetesPodOperator can access K8s secrets as described in the documentation. The rest of the tasks run on Celery workers outside of the K8s cluster.

How can tasks using other operators (e.g., SqlSensor) access the same K8s secrets as the tasks using KubernetesPodOperator?

-- SergiyKolesnikov
airflow
airflow-2.x
kubernetes
mwaa

2 Answers

7/27/2021

You can map the secrets as volumes or environment variables into your worker Pods, and they will be available to all tasks, either as files in a mounted directory or as environment variables.

You just have to modify the Helm chart (or whatever deployment mechanism you use) to add those mappings.
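For example, if the deployment maps a secret key into the workers as the environment variable AIRFLOW_CONN_MY_DB, Airflow resolves it as the connection my_db, and any operator running on those workers can use it. A minimal sketch (connection ID, variable, and table names are all hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.sensors.sql import SqlSensor

# Assumes the deployment maps a K8s secret into the worker pods as the
# environment variable AIRFLOW_CONN_MY_DB (hypothetical name). Airflow
# treats any AIRFLOW_CONN_<CONN_ID> variable as the connection <conn_id>,
# so no connection needs to be stored in the metadata database.
with DAG(
    dag_id="secret_env_example",
    start_date=datetime(2021, 7, 1),
    schedule_interval=None,
) as dag:
    wait_for_rows = SqlSensor(
        task_id="wait_for_rows",
        conn_id="my_db",  # resolved from AIRFLOW_CONN_MY_DB
        sql="SELECT 1 FROM my_table LIMIT 1",  # hypothetical table
        poke_interval=60,
    )
```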

-- Jarek Potiuk
Source: StackOverflow

8/9/2021

If you need to pass secrets between MWAA and K8s, I would suggest using an external secrets manager.

AWS Secrets Manager can be used natively by both MWAA and K8s:

https://docs.aws.amazon.com/mwaa/latest/userguide/connections-secrets-manager.html
https://docs.aws.amazon.com/AmazonECS/latest/developerguide/specifying-sensitive-data-secrets.html
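As an illustration of the shared-secret pattern (a minimal sketch; the secret name and region are hypothetical), any task, whether it runs in MWAA or in a pod, can fetch the same Secrets Manager entry:

```python
import json

import boto3

# Hypothetical secret name; both the MWAA environment and the K8s workloads
# read the same Secrets Manager entry, so the credentials live in one place.
SECRET_ID = "airflow/connections/my_db"

def get_db_credentials(region_name: str = "us-east-1") -> dict:
    client = boto3.client("secretsmanager", region_name=region_name)
    response = client.get_secret_value(SecretId=SECRET_ID)
    return json.loads(response["SecretString"])
```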

HashiCorp Vault is another option.

One thing to note: do not pass secret values as plain variables to KubernetesPodOperator, since they can end up in logs and the rendered-template view. See the sketch below.
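Instead, you can reference the secret through Airflow's Secret helper so it is resolved inside the cluster (a minimal sketch; the secret, key, and image names are hypothetical):

```python
from airflow.kubernetes.secret import Secret
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

# Reference the K8s secret by name; Kubernetes injects the value at pod
# startup, so it never passes through Airflow's metadata database or logs.
db_password = Secret(
    deploy_type="env",            # expose as an environment variable
    deploy_target="DB_PASSWORD",  # env var name inside the pod
    secret="my-db-credentials",   # hypothetical K8s secret name
    key="password",               # key within that secret
)

run_job = KubernetesPodOperator(
    task_id="run_job",
    name="run-job",
    namespace="default",
    image="my-registry/my-job:latest",  # hypothetical image
    secrets=[db_password],
)
```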


Our solution was to run MWAA tasks on AWS Fargate using the ECS operator:

https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/operators/ecs.html

Secrets are attached per ECS task definition and exposed as environment variables. Developers maintain a simple YAML configuration file defining ECS tasks and their associated secrets from AWS Secrets Manager, and Terraform manages the task definitions based on that YAML. It works really smoothly for our particular use case.
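On the DAG side this looks roughly like the following (a minimal sketch; the cluster, task definition, and subnet names are hypothetical):

```python
from airflow.providers.amazon.aws.operators.ecs import ECSOperator

# The task definition (managed by Terraform from the YAML config) carries
# the Secrets Manager references, so the DAG never handles secret values.
run_on_fargate = ECSOperator(
    task_id="run_on_fargate",
    cluster="my-cluster",      # hypothetical ECS cluster
    task_definition="my-job",  # hypothetical task definition family
    launch_type="FARGATE",
    overrides={"containerOverrides": []},  # no per-run overrides
    network_configuration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],  # hypothetical subnet
            "assignPublicIp": "DISABLED",
        }
    },
)
```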

-- Jacek Sztandera
Source: StackOverflow