How to add a ConfigMap to a Job

7/1/2021

I need to deploy an application as a Job in a k8s environment.

My job.yml

apiVersion: batch/v1
kind: Job
metadata:
  name: hello-stackoverflow
  labels:
    app: hello-stackoverflow
    group: hello-stackoverflow
    version: 1.0.0
spec:
  parallelism: 1
  completions: 1
  template:
    spec:
      containers:
        - name: hello-stackoverflow
          image: docker-registry:1234/hello-stackoverflow:1.0.0
          volumeMounts:
            - name: config
              mountPath: /job/config
      restartPolicy: OnFailure
      volumes:
        - name: config
          configMap:
            name: hello-stackoverflow
            items:
              - key: application.properties
                path: application.properties
              

My configmap.yml

apiVersion: v1
kind: ConfigMap
metadata:
  name: hello-stackoverflow
data:
  application.properties: |
    # spring application properties file
    logging.file=test.log
    logging.level.com.db=DEBUG

When trying to deploy the Job I get the following error:

Failure executing: POST at: {URL} Message: Job.batch "hello-stackoverflow" is invalid:
[spec.template.spec.containers[1].name: Required value, spec.template.spec.containers[1].image: Required value].
Received status: Status(apiVersion=v1, code=422, details=StatusDetails(causes=[StatusCause(field=spec.template.spec.containers[1].name, message=Required value, reason=FieldValueRequired, additionalProperties={}), StatusCause(field=spec.template.spec.containers[1].image, message=Required value, reason=FieldValueRequired, additionalProperties={})]
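Note that the error refers to containers[1], i.e. a second container entry, even though the manifest above defines only one container. One plausible cause (a hypothetical sketch, not necessarily what is in the file actually applied) is a mis-indented key that YAML parses as a new list item with no name or image:

```yaml
# Hypothetical mis-indentation: the dash before volumeMounts starts a
# second list item, so containers[1] exists but has no name/image,
# which matches the "Required value" errors above.
containers:
  - name: hello-stackoverflow
    image: docker-registry:1234/hello-stackoverflow:1.0.0
  - volumeMounts:
      - name: config
        mountPath: /job/config
```

Comparing the indentation in the applied file against the YAML shown above may be worth checking, since the pasted manifest itself defines only containers[0].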

My question is: to attach a ConfigMap to a Job in k8s, do you also need a deployment.yml, or is it possible to do it entirely within your job.yml?

-- Opri
kubernetes
kubernetes-deployment
kubernetes-jobs
openshift

0 Answers