Hi, I created a Node.js app that creates a Kubernetes Job using the kubernetes-client library. That Job then responds back to my Node.js app with an HTTP call. To reduce network latency and timing issues, I want to put everything inside one cluster. Is it possible to create a Job from inside a Deployment?
References: https://github.com/kubernetes-client
apiVersion: batch/v1
kind: Job
metadata:
  name: job
spec:
  ttlSecondsAfterFinished: 10
  template:
    spec:
      containers:
        - name: samplejob
          image: gcr.io/hjgfjfhgffghfght
          command: ["node", "index.js", '{"api_url":"apienpoint"}']
          resources:
            limits:
              memory: "128Mi"
              cpu: "100m"
            requests:
              memory: "128Mi"
              cpu: "100m"
      restartPolicy: Never
  backoffLimit: 1
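Inside the job container, index.js can read the JSON argument from the command and call back to the parent app over HTTP. A minimal sketch, assuming Node 18+ for the global fetch (the callback URL and payload shape are placeholders, not my actual code):

// index.js inside the job image -- illustrative sketch only
// The Job's command passes the JSON options string as the third argv entry.
const { api_url } = JSON.parse(process.argv[2] || '{}');

async function run() {
  // ...do the job's actual work here...
  const result = { status: 'done' };

  // Call back to the parent app via its in-cluster Service DNS name,
  // e.g. http://my-node-app.default.svc.cluster.local:3000/callback (placeholder)
  await fetch(api_url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(result),
  });
}

run().catch((err) => {
  console.error(err);
  process.exit(1); // non-zero exit lets the Job's backoffLimit drive retries
});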
You can build a Docker image for your code. For example, start your main application with node server.js, and reuse the same image in the Kubernetes Job: keep the job script in a separate directory inside the image and run it via the command option in the Job/CronJob YAML template. This will sort it for you.
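As a sketch of what the main application in that same image could look like, assuming a plain Node.js HTTP server with placeholder routes and port:

// server.js -- the main app started by the Deployment (illustrative sketch)
const http = require('http');

const server = http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/callback') {
    // The finished Job POSTs its result back here
    let body = '';
    req.on('data', (chunk) => { body += chunk; });
    req.on('end', () => {
      console.log('job result:', body);
      res.writeHead(200);
      res.end('ok');
    });
  } else if (req.method === 'POST' && req.url === '/trigger') {
    // An incoming event lands here; the handler would call the Kubernetes API
    // to create the Job (see the client-library sketch under the next answer)
    res.writeHead(202);
    res.end('job requested');
  } else {
    res.writeHead(404);
    res.end();
  }
});

server.listen(3000, () => console.log('listening on 3000'));

The Job's index.js would live in another directory of the same image and be started through the Job's command, as described above.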
This is definitely possible. It all depends on how you build your Node.js container image. Your application can run as a Deployment within the cluster. Something will cause the application to trigger an event, and that event sends an API call to the Kubernetes API server requesting that a Job be created.
You can use one of the client libraries to interface with the API server; the Node.js client is maintained by the community.
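A minimal sketch of that flow with the community @kubernetes/client-node library, assuming it runs inside the Deployment's pod; the namespace, image, and callback URL are placeholders, and the exact method signature can differ between library versions:

// create-job.js -- illustrative sketch, run from the Deployment's pod
const k8s = require('@kubernetes/client-node');

const kc = new k8s.KubeConfig();
kc.loadFromCluster(); // use the pod's ServiceAccount when running in-cluster

const batchApi = kc.makeApiClient(k8s.BatchV1Api);

const job = {
  apiVersion: 'batch/v1',
  kind: 'Job',
  metadata: { name: `samplejob-${Date.now()}` }, // unique name per trigger
  spec: {
    ttlSecondsAfterFinished: 10,
    backoffLimit: 1,
    template: {
      spec: {
        restartPolicy: 'Never',
        containers: [{
          name: 'samplejob',
          image: 'gcr.io/your-project/your-job-image', // placeholder image
          command: ['node', 'index.js',
            JSON.stringify({ api_url: 'http://my-node-app.default.svc.cluster.local:3000/callback' })],
        }],
      },
    },
  },
};

// Older 0.x releases take (namespace, body); check your client version's signature.
batchApi.createNamespacedJob('default', job)
  .then(() => console.log('job created'))
  .catch((err) => console.error('failed to create job', err));

For this to work in-cluster, the Deployment's ServiceAccount also needs a Role/RoleBinding that allows creating jobs in the batch API group.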