I have my data stored in GitHub in JSON format. My pods clone this repo and use it, and whenever the data is updated, a git hook fires, and the expectation is that my pods update to the latest data (by running git pull). I have exposed this update service via a load balancer and configured its address in the git hook.
However, when the git hook fires, only one of the pods gets the request and does a git pull. Is there a way to notify all the pods under that service to update their local store?
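For reference, the update service running in each pod is essentially a small HTTP handler that runs git pull; a rough sketch is below (the /update path, port 8080, and repo location are illustrative, not my exact setup):

```python
# Minimal sketch of the in-pod update endpoint: on every webhook POST it
# runs "git pull" in the local clone. Path, port, and repo dir are illustrative.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

REPO_DIR = "/data/repo"  # where the pod cloned the GitHub repo

class UpdateHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)  # drain the webhook payload; its content is not needed here
        if self.path != "/update":
            self.send_response(404)
            self.end_headers()
            return
        result = subprocess.run(
            ["git", "-C", REPO_DIR, "pull", "--ff-only"],
            capture_output=True, text=True,
        )
        self.send_response(200 if result.returncode == 0 else 500)
        self.end_headers()
        self.wfile.write(result.stdout.encode())

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), UpdateHandler).serve_forever()
```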
So to achieve that, I looked for some kind of shared storage that can be mounted in all the containers running in the Kubernetes cluster, e.g. Google Cloud Filestore, the equivalent of AWS EFS. Whenever there is a new commit in GitHub, the load balancer would ask one of the containers to update the file store. Since the same file store is mounted in all the containers, they would all serve the latest data.
But Cloud Filestore is still in beta, not GA.
How does one solve this problem in a Kubernetes environment?
There are two approaches you can try:

1. Every pod (via a cron job) pulls data from the central storage, say once a minute, and updates its working directory when updates are available.
2. The central server pushes the update to each pod individually; going through the load-balanced Service is not appropriate here, since only one pod receives each request (see the sketch after this list).
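For the push approach, one option is to have the webhook receiver enumerate the pod IPs behind the Service via the Kubernetes Endpoints API and call each pod's update endpoint directly, instead of going through the load balancer. A rough sketch using the official Python client (the Service name, namespace, port, and /update path are assumptions):

```python
# Rough sketch: fan the webhook out to every pod behind the Service by
# reading the Service's Endpoints and calling each pod IP directly.
# The Service name, namespace, port 8080 and the /update path are assumptions.
import requests
from kubernetes import client, config

def notify_all_pods(service="data-service", namespace="default", port=8080):
    config.load_incluster_config()  # or config.load_kube_config() when run outside the cluster
    v1 = client.CoreV1Api()
    endpoints = v1.read_namespaced_endpoints(service, namespace)
    for subset in endpoints.subsets or []:
        for address in subset.addresses or []:
            url = f"http://{address.ip}:{port}/update"
            try:
                requests.post(url, timeout=10)
            except requests.RequestException as exc:
                print(f"failed to notify {address.ip}: {exc}")

if __name__ == "__main__":
    notify_all_pods()
```

The caller needs RBAC permission to read Endpoints; alternatively, a headless Service lets you resolve all pod IPs via DNS without calling the API.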
You can also think of implementing this via Deployments. As mentioned in another answer, NFS can be useful for this kind of sharing.
If you are asking for a way to set up a common volume in Kubernetes that multiple pods can read from, you can set up an NFS pod as explained in This Official Example.
I use this for my Jenkins setup in Kubernetes and it does the job well.
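For illustration, a pod that mounts such an NFS export read-only could be defined roughly as follows, sketched here with the official Python client (the NFS server address, export path, and image are placeholders for whatever your NFS setup provides):

```python
# Rough sketch: a pod that mounts a shared NFS export read-only, built with
# the official Kubernetes Python client. Server address, export path, and
# image are placeholders.
from kubernetes import client, config

def create_reader_pod(nfs_server="nfs-server.default.svc.cluster.local",
                      nfs_path="/exports", namespace="default"):
    config.load_kube_config()
    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="data-reader"),
        spec=client.V1PodSpec(
            containers=[client.V1Container(
                name="app",
                image="nginx",  # placeholder image serving the shared data
                volume_mounts=[client.V1VolumeMount(
                    name="shared-data", mount_path="/data", read_only=True)],
            )],
            volumes=[client.V1Volume(
                name="shared-data",
                nfs=client.V1NFSVolumeSource(server=nfs_server, path=nfs_path),
            )],
        ),
    )
    client.CoreV1Api().create_namespaced_pod(namespace, pod)

if __name__ == "__main__":
    create_reader_pod()
```

With this layout, only one writer has to run git pull into the NFS export; every pod that mounts it sees the latest data immediately.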