Create Kubernetes job from Node.js Cloud Function

9/22/2019

I need to invoke a Kubernetes Job from a Google Cloud Function, but I didn't find any suitable method in https://github.com/googleapis/nodejs-cloud-container

-- BT3
google-cloud-functions
google-cloud-platform
google-kubernetes-engine

1 Answer

9/24/2019

That client library is for managing your GKE cluster itself; it isn't meant to talk to the Kubernetes API running on the cluster.

For that there are the Kubernetes API client libraries.

There is one for Node.js (kubernetes-client/javascript), but as I'm more fluent in Python, below I'll demonstrate how I implemented your use case with the Python Kubernetes API client library running in a GCP Cloud Function.

One part of this process turned out to be slightly cumbersome.

In a regular terminal (given that proper authentication and authorization are in place), you'd issue gcloud container clusters get-credentials and would then be able to communicate with the cluster's Kubernetes API through kubectl.
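For reference, that manual flow looks like this (MY_CLUSTER and MY_ZONE are placeholders to edit):

 gcloud container clusters get-credentials MY_CLUSTER --zone MY_ZONE
 kubectl get nodes # sanity check that the Kubernetes API is reachable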

I haven't yet found an equivalent method to get the credentials for the cluster using the GKE API Python client library, so I've uploaded the ~/.kube/config file to a Cloud Storage bucket, so that the Cloud Function can download it and subsequently talk to the Kubernetes cluster API.

So, let's create a bucket and upload the kubeconfig file to it (note that this is far from ideal security-wise, depending on the IAM policy and ACLs of the bucket in question, but it works for testing purposes).

 MY_GCP_PROJECT=MY_GCP_PROJECT # edit accordingly
 gsutil mb gs://$MY_GCP_PROJECT-configs # set a location if more convenient, with '-l US' for example
 gsutil cp ~/.kube/config gs://$MY_GCP_PROJECT-configs/kubeconfig
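One way to tighten that up a bit (an optional step; the rest of this answer doesn't depend on it) is to grant read access on the bucket only to the service account the function runs as, which by default is PROJECT_ID@appspot.gserviceaccount.com:

 gsutil iam ch serviceAccount:$MY_GCP_PROJECT@appspot.gserviceaccount.com:objectViewer gs://$MY_GCP_PROJECT-configs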

Now for the Cloud Function.

requirements.txt

google-cloud-storage
kubernetes

main.py

import tempfile
import json

from kubernetes import client as kubernetes_client, config as kubernetes_config
from google.cloud import storage


BUCKET, BLOB = "MY_GCP_PROJECT-configs", "kubeconfig"  # TODO edit accordingly (the bucket created above)

storage_client = storage.Client()


def download_gcs_object(name_bucket, name_blob):
    # Download a blob from GCS to a local temporary file and return its path
    bucket = storage_client.bucket(name_bucket)
    blob = bucket.blob(name_blob)
    tmp = tempfile.NamedTemporaryFile(delete=False)
    blob.download_to_filename(tmp.name)
    return tmp.name


def load_kube_config():
    # Point the Kubernetes client at the kubeconfig downloaded from the bucket
    kubernetes_config.load_kube_config(config_file=download_gcs_object(BUCKET, BLOB))


def create_container_object_default():
    # Container that computes pi to 1000 places with perl
    return kubernetes_client.V1Container(name="pi", image="perl", command=["perl", "-Mbignum=bpi", "-wle", "print bpi(1000)"])


def create_job_object(container):
    template = kubernetes_client.V1PodTemplateSpec(
        metadata=kubernetes_client.V1ObjectMeta(labels={"app": "pi"}),
        spec=kubernetes_client.V1PodSpec(restart_policy="Never", containers=[container]))
    return kubernetes_client.V1Job(
        api_version="batch/v1", kind="Job",
        metadata=kubernetes_client.V1ObjectMeta(name="pi"),
        spec=kubernetes_client.V1JobSpec(template=template, backoff_limit=4))


# Load the kubeconfig and build the Batch API client once, at cold start
load_kube_config()

kubernetes_api = kubernetes_client.BatchV1Api()


def hello_world(request):
    job = create_job_object(create_container_object_default())
    # Note: the Job name is fixed ("pi"), so invoking the function again will
    # fail with a 409 Conflict until the existing Job is deleted
    kubernetes_api.create_namespaced_job(body=job, namespace="default")
    return json.dumps(job.to_dict())

This Job (the canonical example from the Kubernetes documentation) consists of a pod with a container (perl image) that uses perl to compute pi to 1000 places, so just modify the container object (a Python object in this case) accordingly, along with the other configurations for the Job itself.
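For instance, a minimal sketch of swapping in your own workload (the image name, command and environment variable below are hypothetical placeholders):

def create_container_object_custom():
    # Hypothetical example: run a one-off script from your own image
    return kubernetes_client.V1Container(
        name="my-task",
        image="gcr.io/MY_GCP_PROJECT/my-task:latest",  # placeholder image
        command=["python", "/app/run_task.py"],  # placeholder command
        env=[kubernetes_client.V1EnvVar(name="TASK_ARG", value="42")])

Pass the result to create_job_object instead of the default container (and adjust the Job's name and labels there to match).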

The function to execute in the Cloud Function is hello_world.
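For completeness, deploying it could look something like this (a sketch, assuming the Python 3.7 runtime and an HTTP trigger; the function name is a placeholder):

 gcloud functions deploy create-k8s-job --runtime python37 --trigger-http --entry-point hello_world

After invoking the deployed function, kubectl get jobs should show the pi Job in the default namespace.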

-- fhenriques
Source: StackOverflow