I need to set the values of `AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER` and `AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY` from CI/CD variables. Both values live in the `airflow_template.yaml` file. I tried substituting the CI/CD variables, but it is not working. If there is a better way to parameterize this, please let me know.
My project folder structure looks like below:

```
dataops
  docker
    base
      airflow.cfg
      airflow_template.yaml
      Dockerfile
    dag-image
      Dockerfile
  helm
    Chart.yaml
    values.yaml
    templates
      deployment.yaml
      svc.yaml
```
airflow_template.yaml:

```yaml
apiVersion: v1
kind: Pod
metadata:
  labels: {}
spec:
  containers:
    - args: []
      command: []
      env:
        - name: AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY
          value: $DEV_AIRFLOW_CONTAINER_REPO
        - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
          value: $DEV_AIRFLOW_LOG_FOLDER
      envFrom: []
      imagePullPolicy: Always
      name: base
      ports: []
      volumeMounts:
        - mountPath: /usr/local/airflow/logs
          name: airflow-logs
  hostNetwork: false
  imagePullSecrets: []
  initContainers: []
  nodeSelector: {}
  restartPolicy: Never
  securityContext:
    runAsUser: 1000
  serviceAccountName: default
  volumes:
    - emptyDir: {}
      name: airflow-logs
```
gitlab-ci.yml:

```yaml
stages:
  - build_and_upload
  - deploy_to_dev

build_and_upload:
  stage: build_and_upload
  image: docker:latest
  variables:
    DOCKER_DRIVER: overlay2
    DOCKER_TLS_CERTDIR: "/certs"
  services:
    - docker:19.03.14-dind
  script:
    - echo $DEV_CREDENTIALS > service_account.json && cat service_account.json | docker login -u _json_key --password-stdin https://gcr.io
    - echo "as- $DEV_AIRFLOW_LOG_FOLDER"
    - export DEV_AIRFLOW_LOG_FOLDER="${DEV_AIRFLOW_LOG_FOLDER}"
    - mkdir -p edfi/operation
    - cp -r airflow_dags/ dataops/docker/dag-image/airflow_dags/
    - cd dataops/docker/dag-image/
    - docker build -t "$DEV_DAGS_IMAGE:$CI_COMMIT_SHORT_SHA" --build-arg COMMIT_HASH=$CI_COMMIT_SHORT_SHA .
    - docker tag $DEV_DAGS_IMAGE:$CI_COMMIT_SHORT_SHA $DEV_DAGS_IMAGE:latest
    - docker push $DEV_DAGS_IMAGE:$CI_COMMIT_SHORT_SHA
    - docker push $DEV_DAGS_IMAGE:latest
  only:
    refs:
      - develop

deploy_to_dev:
  stage: deploy_to_dev
  image: $CI_REGISTRY_IMAGE:kube-image
  script:
    - echo $DEV_CREDENTIALS > service_account.json && cat service_account.json | docker login -u _json_key --password-stdin https://gcr.io
    - echo "as- $DEV_AIRFLOW_LOG_FOLDER"
    - export DEV_AIRFLOW_CONTAINER_REPO="${DEV_AIRFLOW_CONTAINER_REPO}"
    - export DEV_AIRFLOW_LOG_FOLDER="${DEV_AIRFLOW_LOG_FOLDER}"
    - gcloud auth activate-service-account $DEV_SERVICE_ACCOUNT --key-file=./service_account.json --project=$DEV_PROJECT_NAME
    - gcloud container clusters get-credentials $DEV_GKE_CLUSTER --region $REGION
    - echo $DEV_DB_CONN > dataops/helm/airflow-loadbalancer/files/secrets/airflow/AIRFLOW__CORE__SQL_ALCHEMY_CONN
    - cd dataops/helm/
    - helm upgrade airflow-dev airflow-loadbalancer/ --install --atomic --set dags_image.tag=$CI_COMMIT_SHORT_SHA
  only:
    refs:
      - develop
```
You could turn `airflow_template.yaml` into a Jinja2 template and use a small Python program to interpolate the values into it. Note that nothing in the pipeline above rewrites the file: exporting `DEV_AIRFLOW_CONTAINER_REPO` and `DEV_AIRFLOW_LOG_FOLDER` only sets them in the job's shell, so the literal `$DEV_...` placeholder strings end up in the pod spec unchanged. Rendering the template yourself fixes that, and it also gives you the flexibility to pull values from environment variables or anywhere else.
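A minimal sketch of that approach, under these assumptions: the placeholders in the template are renamed to Jinja2 syntax, and the script name `render_template.py` is illustrative, not part of your repo. The `env` entries in `airflow_template.yaml` would become:

```yaml
env:
  - name: AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY
    value: "{{ DEV_AIRFLOW_CONTAINER_REPO }}"
  - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
    value: "{{ DEV_AIRFLOW_LOG_FOLDER }}"
```

and the Python program could be as small as:

```python
#!/usr/bin/env python3
"""Hypothetical helper: render a Jinja2-templated YAML file from
environment variables and write the result to stdout."""
import os
import sys

from jinja2 import Environment, FileSystemLoader, StrictUndefined


def render(template_path: str) -> str:
    env = Environment(
        loader=FileSystemLoader(os.path.dirname(template_path) or "."),
        undefined=StrictUndefined,  # raise an error if a variable is missing
    )
    template = env.get_template(os.path.basename(template_path))
    # GitLab exposes CI/CD variables as environment variables in the job
    # shell, so passing os.environ through makes them all available.
    return template.render(**os.environ)


if __name__ == "__main__":
    print(render(sys.argv[1]))
```

In the deploy job you would run it before anything consumes the file, e.g. `python render_template.py dataops/docker/base/airflow_template.yaml > airflow_template.yaml` (paths assumed from the folder layout above). Using `StrictUndefined` means a typo'd or unset CI/CD variable fails the job immediately instead of silently producing an empty value.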