Failing to connect GKE to GCE on the same VPC?

9/9/2019

I am new to Google Cloud Platform and have the following setup:

I have one Compute Engine VM running a MongoDB server and another Compute Engine VM running a Node.js server, which already runs in Docker. The Node.js application connects to MongoDB via the internal IP on the default VPC. Now I'm trying to migrate the Node.js application to Google Kubernetes Engine, but when I deploy its Docker image to the cluster it can't connect to the MongoDB server.
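For reference, the GKE deployment gets the MongoDB address the same way, via the VM's internal IP passed in an environment variable. Roughly like this, where the deployment name, image, and IP are placeholders, not the real values:

kubectl create deployment nodejs-app --image=gcr.io/my-project/nodejs-app
kubectl set env deployment/nodejs-app MONGO_URL=mongodb://10.142.0.2:27017/mydb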

All services, both GCE and GKE, are in the same region (us-east1).

As a more direct test, I accessed a Kubernetes cluster node via SSH, ran a MongoDB Docker image there, and tried to connect to the remote MongoDB server from the command line, but the problem is the same: the connection times out.
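For concreteness, the test was along these lines (node name, zone, and MongoDB internal IP are placeholders):

gcloud compute ssh gke-my-cluster-default-pool-1234 --zone us-east1-b
docker run --rm mongo:4.0 mongo --host 10.142.0.2 --eval "db.adminCommand('ping')"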

I have also checked the firewall rules on GCP as well as the bindIp setting on the MongoDB server, and neither appears to be blocking the connection.
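Concretely, those checks were along these lines (the config path is the Debian/Ubuntu default and may differ on your image):

gcloud compute firewall-rules list
grep bindIp /etc/mongod.conf

The firewall list shows a rule allowing TCP port 27017 from internal ranges, and bindIp is not restricted to 127.0.0.1.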

Does anyone know what may be happening? Thank you very much.

-- Felipe Antero
google-cloud-platform
google-compute-engine
google-kubernetes-engine
kubernetes
vpc

1 Answer

9/9/2019

By default, containers in a GKE cluster can reach GCE VMs in the same VPC through their internal IPs, just as you can reach the internet (e.g., google.com) from GKE containers; GKE and the VPC know how to route the traffic. The problem is most likely in some other configuration (a firewall rule or your application).

You can run a quick test: start a simple HTTP server on the GCE VM (say its internal IP is 10.138.0.5):

python -m SimpleHTTPServer 8080
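Note that SimpleHTTPServer is the Python 2 module; on Python 3 the equivalent command is:

python3 -m http.server 8080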

Then create a pod in the GKE cluster and try to access the service:

kubectl run my-client -it --image=tutum/curl --generator=run-pod/v1 -- curl http://10.138.0.5:8080
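Note that the --generator flag has been removed in newer kubectl releases; there, an equivalent one-off pod would be something like this (using the maintained curlimages/curl image instead of tutum/curl):

kubectl run my-client -it --rm --restart=Never --image=curlimages/curl --command -- curl http://10.138.0.5:8080

If the curl succeeds, GKE-to-VM connectivity is fine and the issue is specific to MongoDB's port or configuration.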
-- Dagang
Source: StackOverflow