Associate Elastic IPs with EKS worker nodes?

1/15/2019

I've just successfully followed the AWS EKS Getting Started Guide, and now I have an operational Kubernetes cluster with 3 worker nodes.

Worker node EC2 instances have auto-assigned public IPs:

IPv4 Public IP: 18.197.201.199
Private IPs: 192.168.180.57, 192.168.148.90
Secondary private IPs: 192.168.170.137, 192.168.180.185, 192.168.161.170, 192.168.133.109, 192.168.182.189, 192.168.189.234, 192.168.166.204, 192.168.156.144, 192.168.133.148, 192.168.179.151

In order to connect to private off-AWS resources, firewall rules require that node public IPs come from a specific pool of Elastic IPs. (More specifically, the worker nodes must access a private Docker registry behind the corporate firewall, which whitelists several AWS Elastic IPs.) The simplest approach seems to be to override the auto-assigned public node IPs with pre-defined Elastic IPs; however, AWS only allows an Elastic IP to be associated with one specific private IP.
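For reference, this is roughly what that per-node association would look like with boto3 (the instance ID is a placeholder, and the Elastic IP is freshly allocated here rather than taken from our whitelisted pool):

    import boto3

    ec2 = boto3.client("ec2")  # region comes from the environment/config

    # Allocate a VPC-scoped Elastic IP (in practice, reuse one from the
    # pre-approved pool instead)
    alloc = ec2.allocate_address(Domain="vpc")

    # An Elastic IP can only be associated with one specific private IP
    # on one specific instance
    ec2.associate_address(
        AllocationId=alloc["AllocationId"],
        InstanceId="i-0123456789abcdef0",   # placeholder worker node ID
        PrivateIpAddress="192.168.180.57",  # the node's primary private IP
    )

This works for a single node, but would have to be repeated for every node and for every replacement node.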

How do I proceed to replace the auto-assigned public IPs with Elastic IPs?

-- Andrey Paramonov
amazon-eks
amazon-web-services
kubernetes

1 Answer

1/25/2019

Remember that nodes can come and go.

You wouldn't want a specific node in your cluster configured with an Elastic IP that was cleared for your off-AWS resource(s).

Instead, you would assign an Elastic IP to a NAT Gateway and place the cluster node(s) in a private subnet that uses that NAT Gateway for outbound communication.
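As a minimal boto3 sketch of that setup (the subnet and route table IDs are placeholders; the NAT Gateway goes in a public subnet, and the route table is the one attached to the private subnet your nodes run in):

    import boto3

    ec2 = boto3.client("ec2")

    # 1. Allocate an Elastic IP (or reuse an already-whitelisted allocation)
    alloc = ec2.allocate_address(Domain="vpc")

    # 2. Create the NAT Gateway in a *public* subnet
    nat = ec2.create_nat_gateway(
        SubnetId="subnet-0aaa1111",  # placeholder public subnet
        AllocationId=alloc["AllocationId"],
    )
    nat_id = nat["NatGateway"]["NatGatewayId"]

    # NAT Gateways take a few minutes to become available
    ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_id])

    # 3. Send the private subnet's outbound traffic through the NAT Gateway
    ec2.create_route(
        RouteTableId="rtb-0bbb2222",  # placeholder private route table
        DestinationCidrBlock="0.0.0.0/0",
        NatGatewayId=nat_id,
    )

Every node in that private subnet then reaches the registry from the NAT Gateway's single, stable Elastic IP, no matter how often the nodes are replaced.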

This configuration is described beginning on page 85 of this PDF: https://docs.aws.amazon.com/eks/latest/userguide/eks-ug.pdf

-- Bradley
Source: StackOverflow