I'm looking for the proper way to access Google Cloud Storage from a container, with everything running in Google's cloud.
I have the following test script.py:
from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket('my_test_bucket')
blob = bucket.blob('test_file.txt')
blob.upload_from_string('test content')
and a Docker image (Python 3.7) that simply runs it.
The bucket is private and thus needs authentication, and that is where my problem lies.
Other users say that this should happen transparently, as long as the code runs in the correct project. However, this did not work for me (since that question was answered, the old python client library has been deprecated in favor of google-cloud-storage, and Container Engine has been renamed to Kubernetes Engine).
I managed to get it working by creating a service account and setting GOOGLE_APPLICATION_CREDENTIALS to the path of its JSON key file, but this requires baking private credentials into the (possibly public) Docker image.
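For reference, this is roughly what my working Dockerfile looks like (file names are placeholders); copying key.json into the image is exactly the part I'd like to avoid:

```dockerfile
FROM python:3.7-slim
RUN pip install google-cloud-storage
# Baking the private service-account key into the image is the problem:
# anyone who can pull the image can read the credentials.
COPY script.py key.json /app/
ENV GOOGLE_APPLICATION_CREDENTIALS=/app/key.json
CMD ["python", "/app/script.py"]
```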
What is the proper way to do this?