GitLab Runner can't access self-hosted GitLab instance

6/26/2018

We have a self-hosted Gitlab Enterprise installation on a Google Compute Engine instance. This instance is protected with a firewall so only our employees can access the server.

When we deploy a Kubernetes cluster (using GitLab CI), the runners cannot access GitLab and thus cannot start the CI jobs.

I can manually add the external IP address of the Google Kubernetes instance to our GitLab firewall (a GCP firewall rule allowing all protocols and ports for the selected IP ranges), and then it works. But because we have a changing number of Kubernetes instances (including preemptible instances), we have to do this manually every day.
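For reference, the manual step is roughly the following; the rule name, network, and source range are placeholders, and only the source range changes each day:

    gcloud compute firewall-rules create allow-gke-runner \
        --network=default \
        --direction=INGRESS \
        --allow=all \
        --source-ranges=35.233.10.20/32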

That is not an optimal situation. I already tried adding internal IP ranges (10.132.0.0/20, 10.0.0.0/8, 10.56.0.0/14), but that was not the solution: the runners still can't reach the GitLab server unless the exact instance IP is whitelisted.

What am I missing?

-- Eragon666
firewall
gitlab-ci-runner
google-compute-engine
kubernetes

1 Answer

7/4/2018

GKE nodes appear as VM instances in the GCE platform. They are managed by the master and can be deleted (by the kube-controller) if they are deemed unhealthy, then recreated, so their external IP addresses are ephemeral. Whitelisting the external IP address of each VM instance is therefore not feasible: the addresses change all the time. This is also why the internal ranges did not help; traffic from the nodes to your GitLab server's public address egresses through each node's external IP, so your firewall never sees the internal source ranges.
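You can watch this happen from the cluster side, assuming kubectl is pointed at it; the EXTERNAL-IP column changes every time a node is recreated or a preemptible instance is replaced:

    kubectl get nodes -o wide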

One workaround is to use a NAT gateway: route all outbound traffic from the GKE nodes through a single VM instance that acts as the NAT gateway, so everything egresses from one static external IP address, the external IP of that gateway.
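A minimal sketch of that setup, assuming a europe-west1 region and placeholder names throughout (gitlab-nat-ip, nat-gateway, gitlab-via-nat, and a gke-nodes network tag that you would also apply to the node pool):

    # Reserve a static external IP for the gateway
    gcloud compute addresses create gitlab-nat-ip --region=europe-west1

    # Create the gateway VM with IP forwarding enabled
    gcloud compute instances create nat-gateway \
        --zone=europe-west1-b \
        --can-ip-forward \
        --address=gitlab-nat-ip \
        --tags=nat

    # On the gateway VM itself: forward and masquerade outbound traffic
    sudo sysctl -w net.ipv4.ip_forward=1
    sudo iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE

    # Send traffic from tagged GKE nodes through the gateway instead of
    # the default internet gateway (lower priority number wins; default is 1000)
    gcloud compute routes create gitlab-via-nat \
        --destination-range=0.0.0.0/0 \
        --next-hop-instance=nat-gateway \
        --next-hop-instance-zone=europe-west1-b \
        --tags=gke-nodes \
        --priority=800

Keep in mind the gateway VM becomes a single point of failure for all outbound traffic from the nodes; for anything beyond a proof of concept you would typically run it in a managed instance group.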

That one static IP address is all you need to add to the GitLab firewall rule, no matter how many nodes come and go.

-- Mahmoud Sharif
Source: StackOverflow