Can someone give me some guidance on best practices for dynamically bringing multiple Talend jobs into Kubernetes?
I ended up not using Job2Docker, opting instead for a plain Docker build.
Dockerfile
# The old "java" image is deprecated on Docker Hub; openjdk:8-jre
# (Java 8 runtime, assumed here) covers the Talend job's JRE requirement
FROM openjdk:8-jre
WORKDIR /talend
COPY ./jobs /talend
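To make the image available to the cluster, a build-and-push sketch (the repository name `dockerhubrepo/your-etl` comes from the manifest below; the `./jobs` directory holding the unzipped Talend build export is an assumption of this setup):

```shell
# Build the image from the Dockerfile above; ./jobs holds the exported
# Talend job (the unzipped build archive, including the *_run.sh launcher)
docker build -t dockerhubrepo/your-etl:latest .

# Push so the Kubernetes nodes can pull the image
docker push dockerhubrepo/your-etl:latest
```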
Then create a Kubernetes CronJob resource for the job:
# batch/v1beta1 was removed in Kubernetes 1.25; use batch/v1
apiVersion: batch/v1
kind: CronJob
metadata:
  name: etl-edw-cronjob
spec:
  schedule: "0 * * * *"
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: etl-edw-job
              image: dockerhubrepo/your-etl
              command: ["sh", "./process_data_warehouse_0.1/process_data_warehouse/process_data_warehouse_run.sh"]
              env:
                - name: PGHOST
                  value: postgres-cluster-ip-service
                - name: PGPORT
                  value: "5432"
                - name: PGDATABASE
                  value: infohub
                - name: PGUSER
                  value: postgres
                - name: PGPASSWORD
                  valueFrom:
                    secretKeyRef:
                      name: pgpassword
                      key: PGPASSWORD
                - name: MONGOSERVER
                  value: mongo-service
                - name: MONGOPORT
                  value: "27017"
                - name: MONGODB
                  value: hearth
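With the manifest saved to a file (`etl-cronjob.yaml` is a hypothetical name), it can be applied and smoke-tested with a one-off run rather than waiting for the next scheduled tick:

```shell
kubectl apply -f etl-cronjob.yaml

# Trigger a single run immediately from the CronJob template
kubectl create job --from=cronjob/etl-edw-cronjob etl-edw-manual-run

# Follow the Talend job's output
kubectl logs -f job/etl-edw-manual-run
```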