Connecting to external mysql db from nodejs GKE pod

2/28/2019

I have a nodejs express app that connects to a mysql DB using:

const fs = require('fs');

const dbconfig = {
    client: 'mysql',
    connection: {
        host: config.db.host,
        user: config.db.user,
        password: config.db.password,
        database: config.db.database,
        port: config.db.port,
        charset: 'utf8',
        ssl: {
            // CA certificate used to verify the DB server's TLS cert
            ca: fs.readFileSync(__dirname + '/root_ca.pem')
        }
    }
};

In my local Docker environment this connection succeeds; however, when I deploy the app to a Kubernetes cluster, I am unable to connect to that host:port.

The VPC is set up to allow Ingress/Egress traffic on that host/port.

And a Service and Endpoints object were set up as well:

kind: "Service"
apiVersion: "v1"
metadata:
  name: "mysql"
spec:
  ports:
    - name: "mysql"
      protocol: "TCP"
      port: 13306
      nodePort: 0
  selector: {}

---

kind: "Endpoints"
apiVersion: "v1"
metadata:
  name: "mysql"
subsets:
  - addresses:
      - ip: "34.201.17.84"
    ports:
      - port: 13306
        name: "mysql"
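With a selector-less Service backed by manual Endpoints like the pair above, the app would point at the Service's in-cluster DNS name instead of the external IP; Kubernetes then forwards traffic to the Endpoints address. A minimal sketch of the corresponding connection settings (names taken from the manifests above; the rest of the config is assumed unchanged):

```javascript
// Sketch: connect via the selector-less Service rather than the raw IP.
// "mysql" is the Service name from the manifest; the fully qualified
// form would be mysql.<namespace>.svc.cluster.local.
const connection = {
  host: 'mysql',   // Service name resolved by cluster DNS
  port: 13306      // Service port from the manifest
};

module.exports = connection;
```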

Update: Still no luck, but further debugging shows that neither the pod nor the node is able to reach the host.
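One way to narrow this down is to test raw TCP reachability from inside the cluster, separate from the app and its TLS setup. A hedged sketch (the host placeholder and pod name are illustrative, not from the original setup):

```shell
# Test TCP reachability from a throwaway pod; replace DB_HOST with the
# external MySQL host. A timeout here means the packets never reach the
# DB (firewall/whitelist), independent of any MySQL or TLS config.
kubectl run dbtest -it --rm --restart=Never --image=busybox -- \
  nc -zv -w 5 DB_HOST 13306
```

If this fails from the pod but succeeds from a whitelisted machine, the problem is network-level (egress IP not whitelisted) rather than application-level.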

-- Chaosjosh
google-kubernetes-engine
kubernetes
mysql
networking
node.js

1 Answer

3/14/2019

So with the help of Google support I was able to find a solution to my problem. The issue was that the IP address whitelisted to connect to the database was not the IP address the cluster's egress traffic actually uses; load balancers handle ingress traffic, not egress traffic.

The workaround is to create a private cluster and route its egress traffic through a single IP (or IP range) using the Google Cloud NAT service, then whitelist that IP on the database. Once that was done, I was able to connect to the DB successfully without needing the extra Endpoints/Service for mysql.
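A rough sketch of the Cloud NAT setup described above (router/NAT names and region are placeholders, not from the original environment):

```shell
# Create a Cloud Router in the cluster's VPC, then attach a Cloud NAT
# config so the private cluster's egress traffic exits via stable
# NAT IPs, which can then be whitelisted on the database.
gcloud compute routers create nat-router \
  --network=default --region=us-east1

gcloud compute routers nats create nat-config \
  --router=nat-router --region=us-east1 \
  --auto-allocate-nat-external-ips \
  --nat-all-subnet-ip-ranges
```

With --auto-allocate-nat-external-ips the NAT IPs are picked automatically; reserving static external IPs and passing them explicitly instead gives a fixed address to whitelist.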

-- Chaosjosh
Source: StackOverflow