Load balancing Python web servers in Kubernetes pods

5/5/2020

I have been learning Docker and Kubernetes. I want to create two Python web servers, access them through a public URL, and have the requests balanced between the two servers.

I created one Python server and initially deployed it in a Docker container on an AWS EC2 instance, so when I send a request I use ec2publicip:port. This works, which means one web server is up, and I will do the same for the second server.

My question is: if I deploy this with Kubernetes, is there any way to load balance between the Python web servers running in pods? If so, can someone tell me how to do it?

-- HARINI NATHAN
docker
kubernetes
python-3.x

1 Answer

5/5/2020

If you create two replicas of the pod via a Kubernetes Deployment and expose them with a Service of type LoadBalancer, an ELB is automatically provisioned on AWS. Whenever a request arrives at the ELB, the traffic is distributed across the replicas of the pod. With a LoadBalancer-type Service you also get the ELB's own load-balancing capabilities (for HTTP/HTTPS listeners it can operate at layer 7); without a LoadBalancer Service or an ingress you get the round-robin load balancing at layer 4 offered by kube-proxy.
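A minimal sketch of that setup is below; the image name `your-registry/python-web:latest`, the container port 8000, and the `app: python-web` label are placeholders for whatever your server actually uses:

```yaml
# Deployment running two replicas of the Python web server
apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-web
spec:
  replicas: 2
  selector:
    matchLabels:
      app: python-web
  template:
    metadata:
      labels:
        app: python-web
    spec:
      containers:
      - name: python-web
        image: your-registry/python-web:latest   # placeholder image
        ports:
        - containerPort: 8000                    # port your server listens on
---
# Service of type LoadBalancer; on AWS this provisions an ELB
apiVersion: v1
kind: Service
metadata:
  name: python-web
spec:
  type: LoadBalancer
  selector:
    app: python-web
  ports:
  - port: 80          # port exposed by the ELB
    targetPort: 8000  # port on the pods
```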

The problem with a LoadBalancer-type Service is that it creates a new ELB for each Service, which gets costly. So I recommend using an ingress controller such as Nginx and exposing the Nginx ingress controller through a single load balancer on AWS. Then create Ingress resources and use path- or host-based routing to send traffic to the pods behind a ClusterIP-type Service, as sketched below.
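As a rough sketch, reusing the `python-web` Service from above but with its default ClusterIP type (drop `type: LoadBalancer`), an Ingress routed by the Nginx ingress controller could look like the following; the host `app.example.com` is a placeholder, and this assumes a cluster where the `networking.k8s.io/v1` Ingress API is available:

```yaml
# Ingress handled by the Nginx ingress controller; one ELB in front of the
# controller can serve many such Ingress resources
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: python-web
spec:
  ingressClassName: nginx
  rules:
  - host: app.example.com        # host-based routing; placeholder hostname
    http:
      paths:
      - path: /                  # path-based routing
        pathType: Prefix
        backend:
          service:
            name: python-web     # ClusterIP Service in front of the pods
            port:
              number: 80
```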

-- Arghya Sadhu
Source: StackOverflow