How do I ssh to nodes in ACS Kubernetes cluster?

6/19/2017

I have created an ACS Kubernetes cluster following the instructions here: https://docs.microsoft.com/en-us/azure/container-service/container-service-kubernetes-walkthrough .

I see that the master node has a public IP and I can SSH into it as azureuser. But the regular (agent) nodes have no public IP, and I don't see how to SSH into them from the master node.

How do I SSH into the regular nodes?

-- codefx
azure
azure-container-service
kubernetes

3 Answers

10/30/2018

Microsoft has released official docs at https://docs.microsoft.com/en-us/azure/aks/ssh. The idea is to open an interactive session in a pod and use that as a jump host to reach the agent node.
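A minimal sketch of that approach (the pod name, image, packages, and node IP below are illustrative, not from the docs; it assumes kubectl is already configured for your cluster and your private key has been made available inside the pod):

```shell
# Start a throwaway pod to use as a jump host (name and image are illustrative)
kubectl run ssh-jump -it --rm --image=debian -- bash

# Then, inside the pod: install an SSH client and hop to the agent node
#   apt-get update && apt-get install -y openssh-client
#   ssh -i /id_rsa azureuser@10.240.0.4    # agent node's internal IP
```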

-- andig
Source: StackOverflow

7/10/2017

You can use one of the k8s masters as a "bastion host" and avoid copying the keys over, e.g.:

# In ~/.ssh/config

Host agent1_private_ip agent2_private_ip ....
  IdentityFile ~/.ssh/<your_k8s_cluster_key>
  ProxyCommand ssh user@master_public_ip -W %h:%p

Now just ssh user@agent1_private_ip

See more here: http://blog.scottlowe.org/2015/11/21/using-ssh-bastion-host/
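On newer OpenSSH (7.3+), the same hop can be done without a config entry by using a jump host on the command line; this one-liner assumes the same user, key, and hosts as the config example above:

```shell
# -J treats the master as a jump host in a single command (OpenSSH 7.3+),
# equivalent to the ProxyCommand config entry above
ssh -i ~/.ssh/<your_k8s_cluster_key> -J user@master_public_ip user@agent1_private_ip
```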


PS: Here's a quickie to retrieve your agent private IPs, in /etc/hosts format:

kubectl get nodes -o json | jq -r '.items[].status.addresses[].address' | paste - -
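To see why the trailing "paste - -" yields hosts-file-style lines: the jq filter prints each node's addresses one per line (typically the InternalIP followed by the Hostname), and "paste - -" joins every two consecutive lines with a tab. A standalone illustration with made-up addresses:

```shell
# Made-up sample: jq would emit IP and hostname on alternating lines;
# paste - - joins each consecutive pair with a tab, giving /etc/hosts rows
printf '10.240.0.4\nk8s-agent-0\n10.240.0.5\nk8s-agent-1\n' | paste - -
```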
-- Valer
Source: StackOverflow

6/19/2017

You could copy the private key to your master VM. Then, from the master, run ssh -i <path>/id_rsa user@<agent private IP> to reach a k8s agent VM.

Note: the agent VMs use the same user name and private key as the master VM.
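Concretely, the copy-the-key flow might look like this (the user name, key path, and angle-bracket IPs are placeholders to fill in for your cluster):

```shell
# Copy your private key up to the master VM (placeholder host/paths)
scp -i ~/.ssh/id_rsa ~/.ssh/id_rsa azureuser@<master_public_ip>:~/.ssh/id_rsa

# SSH to the master, then from the master to an agent with the same key/user
ssh -i ~/.ssh/id_rsa azureuser@<master_public_ip>
ssh -i ~/.ssh/id_rsa azureuser@<agent_private_ip>    # run this on the master
```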

-- Shui shengbao
Source: StackOverflow