Helm + Kubernetes: upload a large file (~30-80 MB) to the cluster and mount it to pods

11/25/2021

I have a Helm + Kubernetes setup. I need to store a large file (~30-80 MB) in the cluster and mount it into pods. How do I achieve this so that I don't have to manually upload the file to every environment?

-- NanoGuy
kubernetes
kubernetes-helm
large-files

2 Answers

11/25/2021

One way to do this is to use a Helm install/upgrade hook AND an init container:

  1. Set a Helm install hook that creates a Kubernetes Job which downloads the file to a mounted volume.
  2. An init container on the pod waits until the download is complete before the main container starts (see the sketch after this list).
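Here is a minimal sketch of the two pieces, assuming a chart value `largeFile.url` for the download location and a shared PVC named `{{ .Release.Name }}-shared-data`; both names are hypothetical, not something the answer specifies. Sharing one PVC between the hook Job and the application pods generally requires a `ReadWriteMany`-capable volume (or pods scheduled on the same node).

```yaml
# templates/download-job.yaml -- Helm hook Job that fetches the file onto a shared volume.
apiVersion: batch/v1
kind: Job
metadata:
  name: {{ .Release.Name }}-file-download
  annotations:
    "helm.sh/hook": post-install,post-upgrade
    "helm.sh/hook-delete-policy": before-hook-creation
spec:
  template:
    spec:
      restartPolicy: OnFailure
      containers:
        - name: downloader
          image: curlimages/curl
          command: ["sh", "-c"]
          # .Values.largeFile.url is an assumed chart value pointing at the file
          args: ["curl -fSL -o /data/large-file.bin {{ .Values.largeFile.url }}"]
          volumeMounts:
            - name: shared-data
              mountPath: /data
      volumes:
        - name: shared-data
          persistentVolumeClaim:
            claimName: {{ .Release.Name }}-shared-data
```

And the corresponding fragment of the application pod spec, where the init container blocks until the downloaded file appears on the same volume:

```yaml
      initContainers:
        - name: wait-for-file
          image: busybox
          # Poll the shared volume until the hook Job has finished writing the file
          command: ["sh", "-c", "until [ -f /data/large-file.bin ]; do echo waiting; sleep 5; done"]
          volumeMounts:
            - name: shared-data
              mountPath: /data
      containers:
        - name: app
          image: my-app:latest   # placeholder image
          volumeMounts:
            - name: shared-data
              mountPath: /data
      volumes:
        - name: shared-data
          persistentVolumeClaim:
            claimName: {{ .Release.Name }}-shared-data
```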
-- galdin
Source: StackOverflow

11/25/2021

You can share common files using NFS. There are many ways to use NFS with Kubernetes, such as this one. If your cluster is managed by a cloud provider such as AWS, consider EFS, which is NFS-compatible; NFS-compatible storage on cloud platforms is very common today. This way you never need to manually upload files to worker nodes. Your Helm chart can then focus on creating the necessary PersistentVolume/PersistentVolumeClaim and the volume mounts that access the shared files.
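A minimal sketch of that approach, assuming a pre-provisioned NFS export; the server address, export path, sizes, and resource names are placeholders, not values from the answer:

```yaml
# NFS-backed PersistentVolume that any pod in the cluster can mount
apiVersion: v1
kind: PersistentVolume
metadata:
  name: shared-files-pv
spec:
  capacity:
    storage: 1Gi
  accessModes:
    - ReadWriteMany          # NFS lets many pods on many nodes share the file
  persistentVolumeReclaimPolicy: Retain
  nfs:
    server: nfs.example.com  # placeholder NFS/EFS endpoint
    path: /exports/shared
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: shared-files-pvc
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: ""       # bind to the pre-created PV above, no dynamic provisioning
  resources:
    requests:
      storage: 1Gi
---
# Pod-level usage: mount the claim wherever the application expects the file
apiVersion: v1
kind: Pod
metadata:
  name: app
spec:
  containers:
    - name: app
      image: my-app:latest   # placeholder image
      volumeMounts:
        - name: shared-files
          mountPath: /data
  volumes:
    - name: shared-files
      persistentVolumeClaim:
        claimName: shared-files-pvc
```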

-- gohm'c
Source: StackOverflow