How can I make a Secret created from a Google Cloud service-account JSON key file available to a service account? I have MiniKF with Kubeflow installed on a VM. I am trying to build a container image from a Jupyter notebook running in the MiniKF Kubernetes cluster. The notebook itself has access to GCP through a PodDefault, but the Kaniko pod that the notebook launches automatically cannot access GCP. The code in the Jupyter notebook that builds the container is as follows:
IMAGE_NAME = "mnist_training_kf_pipeline"
TAG = "latest"  # alternatively: "v_$(date +%Y%m%d_%H%M%S)"
GCR_IMAGE = "gcr.io/{PROJECT_ID}/{IMAGE_NAME}:{TAG}".format(
    PROJECT_ID=PROJECT_ID,
    IMAGE_NAME=IMAGE_NAME,
    TAG=TAG,
)

builder = kfp.containers._container_builder.ContainerBuilder(
    gcs_staging=GCS_BUCKET + "/kfp_container_build_staging")

image_name = kfp.containers.build_image_from_working_dir(
    image_name=GCR_IMAGE,
    working_dir='./tmp/components/mnist_training/',
    builder=builder,
)
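Before kicking off the build, I verify that the notebook pod itself can see the mounted key. This is a minimal sanity check I use (the env-var name is the standard Application Default Credentials one; the mount path is whatever the PodDefault configures, so treat the exact path as an assumption):

```python
import os

def credentials_visible(environ, file_exists):
    """Return True if Application Default Credentials should resolve
    from a mounted key file in the given environment.

    environ     -- a mapping like os.environ
    file_exists -- callable path -> bool (os.path.exists in practice)
    """
    cred_path = environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    return bool(cred_path) and file_exists(cred_path)

# In the notebook this would be called as:
#   credentials_visible(os.environ, os.path.exists)
```

In my notebook this check passes, which matches the fact that the build is able to stage files on GCS.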
I get the error:
Error: error resolving source context: dialing: google: could not find default credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
Usage:
executor [flags]
The pod whose name starts with "kaniko" is created but fails because it cannot access Google Cloud Storage. The proof that the Jupyter notebook is able to use my Secret "user-gcp-sa" is that the code above successfully stages the build files on GCS.
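For context on where the error comes from: Application Default Credentials are resolved, per Google's documentation, by checking the GOOGLE_APPLICATION_CREDENTIALS env var, then the gcloud well-known credentials file, then the GCE/GKE metadata server. A rough, illustrative sketch of that lookup order (not the real library code; the function and parameter names are mine):

```python
def resolve_adc_source(env, well_known_file_exists=False,
                       metadata_server_reachable=False):
    """Illustrative sketch of the ADC lookup order. Returns which source
    would supply credentials, or None -- the 'could not find default
    credentials' case the Kaniko pod hits."""
    if env.get("GOOGLE_APPLICATION_CREDENTIALS"):
        return "key file from GOOGLE_APPLICATION_CREDENTIALS"
    if well_known_file_exists:
        return "gcloud application-default credentials file"
    if metadata_server_reachable:
        return "GCE/GKE metadata server"
    return None
```

My understanding is that the Kaniko pod gets neither the env var nor the mounted secret (the PodDefault seems to apply only to the notebook pod), so resolution falls all the way through, producing the error above.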