How to ssh from k8s pod to other computers

7/12/2019

Edit: Before you downvote, please comment on why, so I can improve next time. Thank you.

I tried to ssh from a pod in Kubernetes to another VM in GCE, mainly because I want to use rsync between the two. At the moment, I use gcloud compute scp to copy the file to my local computer and then kubectl cp.

I used kubectl exec to access the pod, set up ssh with ssh-keygen, then copied id_rsa.pub to /home/user/.ssh/ on the designated VM, but when I try ssh -v user@ip it just says the connection timed out. I tried setting up gcloud inside the pod so I could use gcloud compute ssh, and I also tried gcloud compute config-ssh; the results are the same.
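For reference, the key setup described above looks roughly like this (a sketch; `user@ip` is a placeholder for the VM's login and address, and it assumes ssh-copy-id is available in the pod):

```shell
# Inside the pod: generate an RSA key pair with no passphrase.
ssh-keygen -t rsa -f ~/.ssh/id_rsa -N ""

# Append ~/.ssh/id_rsa.pub to /home/user/.ssh/authorized_keys on the VM
# (copying the file into ~/.ssh/ alone is not enough; it must be in authorized_keys).
ssh-copy-id user@ip

# Verbose connection attempt, as in the question.
ssh -v user@ip
```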

When I ssh from my own computer, it works fine.

I think a firewall or network configuration is causing this problem, but I'm not really sure how to fix it. Should I expose the ssh port with a k8s LoadBalancer service, or should I edit my firewall rules in the VPC network?

-- Phootip
google-cloud-platform
google-compute-engine
kubernetes
ssh

1 Answer

7/12/2019

Beginning with Kubernetes version 1.9.x, automatic firewall rules have changed such that workloads in your Kubernetes Engine cluster cannot communicate with other Compute Engine VMs that are on the same network, but outside the cluster. This change was made for security reasons.

You can find the solution HERE

First, find your cluster's network:

gcloud container clusters describe [CLUSTER_NAME] --format="get(network)"

Then get the cluster's IPv4 CIDR used for the containers:

gcloud container clusters describe [CLUSTER_NAME] --format="get(clusterIpv4Cidr)"

Finally create a firewall rule for the network, with the CIDR as the source range, and allow all protocols:

gcloud compute firewall-rules create "[CLUSTER_NAME]-to-all-vms-on-network" \
  --network="[NETWORK]" \
  --source-ranges="[CLUSTER_IPV4_CIDR]" \
  --allow=tcp,udp,icmp,esp,ah,sctp
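The three steps above can be combined into a short script (a sketch; CLUSTER_NAME is a placeholder, and you may need to add a --zone or --region flag depending on how your cluster was created):

```shell
#!/bin/sh
# Sketch: allow GKE pods to reach other Compute Engine VMs on the same network.
CLUSTER_NAME="my-cluster"   # placeholder: substitute your cluster's name

# Step 1: look up the network the cluster lives on.
NETWORK=$(gcloud container clusters describe "$CLUSTER_NAME" --format="get(network)")

# Step 2: look up the pod (container) IPv4 CIDR.
CLUSTER_IPV4_CIDR=$(gcloud container clusters describe "$CLUSTER_NAME" --format="get(clusterIpv4Cidr)")

# Step 3: open the firewall from the pod range to all VMs on that network.
gcloud compute firewall-rules create "${CLUSTER_NAME}-to-all-vms-on-network" \
  --network="$NETWORK" \
  --source-ranges="$CLUSTER_IPV4_CIDR" \
  --allow=tcp,udp,icmp,esp,ah,sctp
```

Once the rule exists, ssh (and therefore rsync over ssh) from a pod to the VM's internal IP should no longer time out.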
-- Ernesto U
Source: StackOverflow