Connecting to Kubernetes cluster on AWS internal network

9/24/2020

I have two Kubernetes clusters in AWS, each in its own VPC.

  • Cluster1 in VPC1
  • Cluster2 in VPC2


I want to make http(s) requests from Cluster1 into Cluster2 through a VPC peering connection. The peering is set up, and I can currently ping hosts in Cluster2 from hosts in Cluster1.

How can I create a service in Cluster2 that I can connect to from Cluster1? I have experience setting up services using external ELBs and the like, but not for internal traffic in the scenario above.

-- silverdagger
amazon-web-services
kubernetes
networking

2 Answers

9/24/2020

You can create an internal LoadBalancer.

All you need to do is create a regular service of type LoadBalancer and annotate it with the following annotation:

service.beta.kubernetes.io/aws-load-balancer-internal: "true"
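A minimal sketch of such a service (the name, namespace, selector, and port are placeholders, not from the question):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: my-internal-service      # placeholder name
  namespace: default
  annotations:
    # Tells the AWS cloud provider to provision an internal ELB
    service.beta.kubernetes.io/aws-load-balancer-internal: "true"
spec:
  type: LoadBalancer
  selector:
    app: my-app                  # placeholder pod selector
  ports:
    - port: 80
      targetPort: 8080           # placeholder container port
```

With this in place, AWS assigns the ELB a DNS name that resolves to private IPs, so it is reachable only from within the VPC or over peering.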
-- Matt
Source: StackOverflow

9/24/2020

Use an internal load balancer.

apiVersion: v1
kind: Service
metadata:
  name: cluster2-service
  namespace: test
  annotations:
    service.beta.kubernetes.io/aws-load-balancer-internal: "true"
spec:
  type: LoadBalancer

That will instruct the AWS cloud provider integration to allocate the ELB on a private subnet, which should make services behind it in the cluster reachable from the other VPC.
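Once the manifest is applied in Cluster2, a rough workflow for reaching it from Cluster1 might look like this (the hostname shown is illustrative, and the port depends on your service spec):

```shell
# Apply the service manifest in Cluster2
kubectl apply -f cluster2-service.yaml

# Retrieve the internal ELB's DNS name once AWS has provisioned it
kubectl get svc cluster2-service -n test \
  -o jsonpath='{.status.loadBalancer.ingress[0].hostname}'

# From a pod in Cluster1, call the service over the VPC peering
# (replace with the hostname printed above)
curl http://internal-example-1234567890.us-east-1.elb.amazonaws.com/
```

Note that the internal ELB's DNS name resolves to private IPs, so Cluster1's route tables must route VPC2's private subnets over the peering, and the ELB's security group must allow traffic from Cluster1's CIDR.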

-- mcfinnigan
Source: StackOverflow