Trying to connect to Google cloud storage (GCS) using python

7/10/2018

I've built the following script:

import boto
import sys
import gcs_oauth2_boto_plugin

def check_size_lzo(ds):
    CLIENT_ID = 'myclientid'
    CLIENT_SECRET = 'mysecret'

    # URI scheme for Cloud Storage.
    GOOGLE_STORAGE = 'gs'

    dir_file = 'date_id={ds}/apollo_export_{ds}.lzo'.format(ds=ds)

    gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
    uri = boto.storage_uri('my_bucket/data/apollo/prod/' + dir_file, GOOGLE_STORAGE)
    key = uri.get_key()

    if key.size < 45379959:
        raise ValueError('umg lzo file is too small, investigate')
    else:
        print('umg lzo file is %sMB' % round(key.size / 1e6, 2))


if __name__ == "__main__":
    check_size_lzo(sys.argv[1])

It works fine locally, but when I try to run it on a Kubernetes cluster I get the following error:

boto.exception.GSResponseError: GSResponseError: 403 Access denied to 'gs://my_bucket/data/apollo/prod/date_id=20180628/apollo_export_20180628.lzo'

I have updated the .boto file on my cluster and added my OAuth client ID and secret, but I'm still having the same issue.

Would really appreciate help resolving this issue.

Many thanks!

-- D_usv
google-cloud-platform
google-cloud-storage
kubernetes
python

1 Answer

7/10/2018

If it works in one environment and fails in another, I assume that you're getting your auth from a .boto file (or possibly from the OAUTH2_CLIENT_ID environment variable), but your Kubernetes instance is lacking such a file. The fact that you got a 403 instead of a 401 says that your remote server is correctly authenticating as somebody, but that somebody is not authorized to access the object, so presumably the call is being made as a different user.

Unless you've changed something, I'm guessing that you're getting the default Kubernetes Engine auth, which means a service account associated with your project. That service account probably hasn't been granted read permission for your object, which is why you're getting a 403. Grant it read/write permission for your GCS resources, and that should solve the problem.
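For reference, granting that permission amounts to adding the service account to a storage role on the bucket's IAM policy. A hedged sketch with the google-cloud-storage client (the bucket name and the `serviceAccount:SA_EMAIL` member are placeholders, not values from your setup); the binding update is written as a pure function over the policy's bindings list so it can be checked without touching GCS:

```python
ROLE = 'roles/storage.objectViewer'  # read-only; use objectAdmin for read/write

def add_member(bindings, member, role=ROLE):
    """Return the bindings list with `member` added under `role`."""
    for binding in bindings:
        if binding['role'] == role:
            binding['members'].add(member)
            return bindings
    bindings.append({'role': role, 'members': {member}})
    return bindings

# Applying it for real (requires permission to set the bucket's IAM policy):
# from google.cloud import storage
# bucket = storage.Client().bucket('my_bucket')
# policy = bucket.get_iam_policy(requested_policy_version=3)
# add_member(policy.bindings, 'serviceAccount:SA_EMAIL')
# bucket.set_iam_policy(policy)
```

You can also do the same thing from the console or with gsutil; the effect is identical, a role binding for the service account on the bucket.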

Also note that by default those credentials aren't scoped to include GCS, so you'll need to add that scope as well and then restart the instance.
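Once the permissions and scopes are in place, a simpler route than boto plus the OAuth plugin is the google-cloud-storage client, which picks up the pod's service-account credentials automatically (no .boto file needed). A minimal sketch reusing your bucket path and size threshold; the pure size check is split out so it can be sanity-checked without any credentials:

```python
MIN_BYTES = 45379959  # threshold from the original script

def describe_size(size_bytes, min_bytes=MIN_BYTES):
    """Raise if the object is too small; otherwise return a readable size."""
    if size_bytes < min_bytes:
        raise ValueError('umg lzo file is too small, investigate')
    return '%sMB' % round(size_bytes / 1e6, 2)

def check_size_lzo(ds):
    # Imported here so describe_size() stays usable without the library.
    from google.cloud import storage  # pip install google-cloud-storage

    client = storage.Client()  # Application Default Credentials on GKE
    blob = client.bucket('my_bucket').get_blob(
        'data/apollo/prod/date_id={ds}/apollo_export_{ds}.lzo'.format(ds=ds))
    if blob is None:
        raise ValueError('object not found (or no read permission)')
    print('umg lzo file is %s' % describe_size(blob.size))

# usage: check_size_lzo('20180628')
```

A 403 here still means the service account lacks permission, but the error comes straight from the credentials the pod actually has, which makes it easier to diagnose.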

-- Brandon Yarbrough
Source: StackOverflow