Local Docker connection to Kubernetes Cluster

6/28/2018

I want to connect a Docker container running locally to a service running on a Kubernetes cluster. To do so, I have exposed the service by reserving some static IP addresses.

I have also saved those IP addresses in local DNS, in the /etc/hosts file:

123.123.123.12 host1
456.456.456.45 host2 

I want to link my container to that service so that all of its traffic is routed to those addresses and processed by the cluster. I am using Docker's link feature, but it isn't working.

Should I connect directly using the IP addresses? How should I do this?

-- kush
docker
kubernetes

1 Answer

6/28/2018

There's no difference here whether or not the client is running in Docker. However you have the service exposed from Kubernetes, you'd make the same connection to it from a process running on an external host or from a process running in a Docker container on that host.

Say, as in the example in the Kubernetes documentation, you're running a NodePort service that's accessible on port 31496 on every node in the cluster, and you're trying to connect to it from outside the cluster. Perhaps, as in the question, 123.123.123.12 is the address of some node in the cluster. A typical setup would be to get the location of the service from an environment variable (JavaScript process.env.THE_SERVICE_URL; Ruby ENV['THE_SERVICE_URL']; Python os.environ['THE_SERVICE_URL']; ...).
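
As a minimal Python sketch of that pattern (the THE_SERVICE_URL variable comes from this answer; the /healthz path and the assumption that the service speaks plain HTTP are illustrative only):

import os
import urllib.request

# Read the service location from the environment; the fallback default is an
# assumption for local development, not part of the original answer.
service_url = os.environ.get("THE_SERVICE_URL", "http://localhost:8080")

# Make an ordinary HTTP request to the exposed service; /healthz is a
# hypothetical endpoint used here only for illustration.
with urllib.request.urlopen(service_url + "/healthz", timeout=5) as resp:
    print(resp.status, resp.read().decode())

The same script runs unchanged on your host or inside a container, as long as THE_SERVICE_URL is set in its environment.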

When you're developing, you could set that variable in your local shell:

export THE_SERVICE_URL=http://123.123.123.12:31496
cd here && ./kubernetes_client_script.py

When you go to deploy your application, you can set the same environment variable:

docker run -e THE_SERVICE_URL=http://123.123.123.12:31496 me:k8s-client
-- David Maze
Source: StackOverflow