Kubernetes - Connecting to a remote database server via ssh

10/26/2018

We're in the process of migrating our web app to a distributed application model using gcloud, Docker, and Kubernetes. We have a remote database server that is configured to only accept connections from one other server, i.e. to reach this database you have to first SSH into that server with a username and password, then connect to the database with the usual MySQL host, user, and password. I've been searching hard to find out how we might configure our Kubernetes pods to make this connection, and it seems there are many approaches that might work but no documented "sure fire" way yet. Our microservices are written with Lumen and successfully connect to our dev database, which is also remote but doesn't require SSH. What would be our best approach? Configure the Dockerfile so the pods SSH out? Or connect a Kubernetes Service to the database and have the pods connect to that?

-- lola_the_coding_girl
devops-services
docker
kubernetes
lumen
ssh

1 Answer

10/27/2018

You have three options:

  1. Re-configure your networking layer to permit remote access from your Kubernetes nodes' egress address(es). This is also known as "punching a hole in the firewall" -- likely not an option, but if it is, it's the easiest solution.
  2. Establish a tunneled connection between your node(s) and the database server using an SSH tunnel -- not highly reliable, and susceptible to network connectivity and recovery issues.
  3. Deploy OpenVPN on the database server (or a node within the same subnet) and run a VPN client (also possible with OpenVPN) within the nodes' pod subnet -- reliable, secure, a bit of work but doable and sustainable. See https://github.com/mateothegreat/k8-byexamples-openvpn for a complete end-to-end example with documentation.
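If you go with option 2, a common pattern is to run the SSH tunnel as a sidecar container in the same pod as the app, since containers in a pod share a network namespace: the app then connects to MySQL on 127.0.0.1 and the sidecar forwards that port through the bastion. A minimal sketch is below; the image, hostnames, user, and secret names are all placeholders, not values from the question, and you'd want key-based auth (mounted from a Secret) rather than the password login described above.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: lumen-app
spec:
  containers:
    - name: app
      image: my-lumen-app:latest      # hypothetical Lumen microservice image
      env:
        - name: DB_HOST
          value: "127.0.0.1"          # connect to the tunnel, not the DB directly
        - name: DB_PORT
          value: "3306"
    - name: ssh-tunnel
      image: kroniak/ssh-client       # any small image with an ssh client
      command:
        - ssh
        - -N                          # no remote command, just forward the port
        - -o
        - StrictHostKeyChecking=no
        - -o
        - ServerAliveInterval=30      # detect and drop dead tunnels
        - -L
        - 127.0.0.1:3306:db.internal.example.com:3306
        - tunneluser@bastion.example.com
      volumeMounts:
        - name: ssh-key
          mountPath: /root/.ssh
          readOnly: true
  volumes:
    - name: ssh-key
      secret:
        secretName: bastion-ssh-key   # private key stored as a Kubernetes Secret
        defaultMode: 0400
```

Note the caveat from option 2 still applies: plain ssh won't reconnect on its own, so in practice you'd wrap it in a restart loop or use something like autossh, and rely on the pod's restartPolicy to bring the sidecar back if it dies.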
-- yomateo
Source: StackOverflow