I am running a Kubernetes Job on GKE and want the Job to be deleted automatically after it completes.
Here is my configuration file for the Job. I set ttlSecondsAfterFinished: 0,
but the Job was not deleted automatically. Am I missing something?
apiVersion: batch/v1
kind: Job
metadata:
  name: myjob
spec:
  # automatically clean up finished job
  ttlSecondsAfterFinished: 0
  template:
    metadata:
      name: myjob
    spec:
      containers:
      - name: myjob
        image: gcr.io/GCP_PROJECT/myimage:COMMIT_SHA
        command: ["bash"]
        args: ["deploy.sh"]
      # Do not restart containers after they exit
      restartPolicy: Never
It depends on how you created the Job.
If you are using a CronJob, you can set spec.successfulJobsHistoryLimit
and spec.failedJobsHistoryLimit to 0.
That tells Kubernetes not to keep any previously finished Jobs.
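For illustration, a minimal CronJob sketch with both history limits set to 0; the name and schedule are placeholders, and the container is copied from the question:

apiVersion: batch/v1beta1   # batch/v1 on Kubernetes 1.21+
kind: CronJob
metadata:
  name: myjob                       # hypothetical name
spec:
  schedule: "*/5 * * * *"
  successfulJobsHistoryLimit: 0     # keep no succeeded Jobs
  failedJobsHistoryLimit: 0         # keep no failed Jobs
  jobTemplate:
    spec:
      template:
        spec:
          containers:
          - name: myjob
            image: gcr.io/GCP_PROJECT/myimage:COMMIT_SHA
            command: ["bash"]
            args: ["deploy.sh"]
          restartPolicy: Never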
If you created the Job from a plain YAML manifest, you have to delete it manually. Alternatively, you can set up a CronJob that runs a cleanup command every 5 minutes (see the sketch below the command):
kubectl delete job $(kubectl get job -o=jsonpath='{.items[?(@.status.succeeded==1)].metadata.name}')
It will delete all Jobs whose status is succeeded.
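A minimal sketch of such a cleanup CronJob, assuming an image that ships kubectl (bitnami/kubectl here) and a pre-existing ServiceAccount named job-cleaner bound to a Role that allows getting, listing and deleting Jobs; all names are placeholders:

apiVersion: batch/v1beta1   # batch/v1 on Kubernetes 1.21+
kind: CronJob
metadata:
  name: job-cleaner
spec:
  schedule: "*/5 * * * *"
  jobTemplate:
    spec:
      template:
        spec:
          serviceAccountName: job-cleaner   # needs RBAC for get/list/delete on jobs
          containers:
          - name: cleaner
            image: bitnami/kubectl
            command: ["/bin/sh", "-c"]
            args:
            - kubectl delete job $(kubectl get job -o=jsonpath='{.items[?(@.status.succeeded==1)].metadata.name}')
          restartPolicy: Never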
It looks like this feature is not yet available on GKE.
https://kubernetes.io/docs/reference/command-line-tools-reference/feature-gates/
https://cloud.google.com/kubernetes-engine/docs/concepts/alpha-clusters#about_feature_stages
To ensure stability and production quality, normal GKE clusters only enable features that
are beta or higher. Alpha features are not enabled on normal clusters because they are not
production-ready or upgradeable.
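If you just want to try the feature, one option (my suggestion, not required by the question) is to create an alpha cluster, where alpha feature gates are enabled; note that alpha clusters cannot be upgraded and expire after 30 days. The cluster name and zone below are placeholders:

# create a GKE alpha cluster with all alpha Kubernetes features enabled
gcloud container clusters create alpha-cluster \
    --enable-kubernetes-alpha \
    --zone us-central1-a \
    --no-enable-autorepair \
    --no-enable-autoupgrade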