Unable to upload a 1GB+ file to Google Storage from the Python google-cloud-storage client

11/5/2017

Hi, I am unable to upload a 'large' (2GB) file to Google Storage from Kubernetes using the google-cloud-storage~=1.6.0 client:

    from google.cloud import storage

    client = storage.Client()
    self.bucket = client.get_bucket('test-bucket')
    blob = self.bucket.blob(remote_file)
    blob.upload_from_filename(local_file)

I gave the pod plenty of memory (4GB+) thinking that might solve it, but the master kills it anyway (OOMKilled). I guess I am missing some configuration in the upload process.

-- Roman
google-cloud-platform
google-cloud-storage
kubernetes
python

1 Answer

11/6/2017

The default behaviour of upload_from_filename is to read the whole file into memory and try to upload it all at once. To avoid this, set blob.chunk_size before calling blob.upload_from_filename. A sensible value might be 1 MiB (1024 * 1024).
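A minimal sketch of that fix, reusing the bucket and variable names from the question:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket('test-bucket')
    blob = bucket.blob(remote_file)

    # Upload in 1 MiB chunks (chunk_size must be a multiple of 256 KiB)
    # instead of reading the entire file into memory at once.
    blob.chunk_size = 1024 * 1024
    blob.upload_from_filename(local_file)

With chunk_size set, the client performs a resumable upload in pieces, so peak memory use stays around one chunk rather than the full file size.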

I suspect this may be a new issue introduced by this commit.

-- David
Source: StackOverflow