When I try to run this task, I get the following error:
from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from datetime import datetime, timedelta

default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2015, 6, 1),
    "email": ["airflow@airflow.com"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG("kubernetes", default_args=default_args, schedule_interval=None)

k = KubernetesPodOperator(
    namespace='kubernetes',
    image="ubuntu:16.04",
    cmds=["bash", "-cx"],
    arguments=["echo", "10", "echo pwd"],
    labels={"foo": "bar"},
    name="airflow-test-pod",
    is_delete_operator_pod=True,
    in_cluster=True,
    task_id="task-two",
    get_logs=True,
    dag=dag)
Error:
File "/usr/local/lib/python3.7/site-packages/kubernetes/config/kube_config.py", line 491, in safe_get
key in self.value):
TypeError: argument of type 'NoneType' is not iterable
What am I doing wrong? I'm using puckel/airflow with the correct dependencies: <https://github.com/puckel/docker-airflow>
Do I need to edit something in airflow.cfg? I don't know where to look for this.
It seems you don't have the config_file parameter set, so KubernetesPodOperator falls back to its default value, which probably doesn't exist either.
My suggestion would be to add config_file="/path/to/kube_config.yaml". In that file you also provide your credentials/tokens.
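A minimal sketch of the change, assuming the kubeconfig is mounted inside the container at a path of your choosing (the path and filename here are placeholders, not something puckel/docker-airflow provides by default). Note that config_file is normally paired with in_cluster=False; with in_cluster=True the operator tries to use the pod's own service account instead of a config file:

```python
k = KubernetesPodOperator(
    namespace='kubernetes',
    image="ubuntu:16.04",
    cmds=["bash", "-cx"],
    arguments=["echo", "10", "echo pwd"],
    labels={"foo": "bar"},
    name="airflow-test-pod",
    is_delete_operator_pod=True,
    in_cluster=False,                         # authenticate via the config file, not the pod's service account
    config_file="/path/to/kube_config.yaml",  # placeholder: point this at your mounted kubeconfig
    task_id="task-two",
    get_logs=True,
    dag=dag)
```

If you are running Airflow itself inside the target cluster and want in_cluster=True to work instead, make sure the scheduler/worker pods run with a service account that has permission to create pods in the target namespace.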