Using AWS, how to ssh to k8s nodes

11/9/2015

The quickstart mentions a few times that "you should be able to ssh into any node in your cluster ..." (e.g., http://kubernetes.io/v1.0/docs/user-guide/connecting-applications.html#environment-variables). I have tried as described below, but the connection times out.

  1. I used export KUBERNETES_PROVIDER=aws; curl -sS https://get.k8s.io | bash to start the cluster
  2. I have only specified AWS_REGION in my environment
  3. The nodes are in a VPC and I can ping them from a bastion host (a rough connectivity check is sketched below)
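In case it helps, a rough connectivity check one could run from the bastion (assuming the AWS CLI is configured there; results omitted) looks like this:

  # Test whether port 22 on the node is actually reachable from the bastion
  # (ping succeeding only shows ICMP is allowed, not that SSH is open)
  nc -vz -w 5 170.20.0.248 22

  # Look up which security groups are attached to that node
  # (use ip-address for a public address, private-ip-address for a private one)
  aws ec2 describe-instances \
    --filters Name=ip-address,Values=170.20.0.248 \
    --query 'Reservations[].Instances[].SecurityGroups[]' \
    --output table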

This is the result of the ssh attempt:

  ubuntu@ip-10-128-1-26:~$ ssh core@170.20.0.248 -v
  OpenSSH_6.6.1, OpenSSL 1.0.1f 6 Jan 2014
  debug1: Reading configuration data /etc/ssh/ssh_config
  debug1: /etc/ssh/ssh_config line 19: Applying options for *
  debug1: Connecting to 170.20.0.248 [170.20.0.248] port 22.
  debug1: connect to address 170.20.0.248 port 22: Connection timed out
  ssh: connect to host 170.20.0.248 port 22: Connection timed out
  ubuntu@ip-10-128-1-26:~$

Any ideas or pointers would be appreciated. Thank you.

-- skwokie
amazon-web-services
kubernetes
ssh

1 Answer

11/10/2015

It looks like your problem is that the security group attached to the node doesn't allow inbound SSH from the machine you're connecting from. Make sure it's open to the public IP or the private IP, depending on which one you're connecting from. As for the right SSH key to use: it's whichever one you set up when spinning up the nodes. You can check that in the EC2 pane of the AWS console under the "Key Pairs" sidebar option (a CLI sketch follows the screenshot below):

[Screenshot: AWS EC2 console, "Key Pairs" section]
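If you prefer the command line, here's a rough sketch of the same fixes (the security group ID, CIDR, key path, and login user are placeholders; adjust them to your setup, since the login user depends on the node image, e.g. core for CoreOS or ubuntu for the stock Ubuntu AMIs):

  # Open inbound SSH (port 22) on the node's security group to the bastion's subnet;
  # sg-xxxxxxxx and 10.128.0.0/16 are placeholders (the CIDR here is just guessed
  # from the bastion's address ip-10-128-1-26)
  aws ec2 authorize-security-group-ingress \
    --group-id sg-xxxxxxxx \
    --protocol tcp --port 22 \
    --cidr 10.128.0.0/16

  # List the key pairs registered in the region to find the one used when the
  # cluster was created
  aws ec2 describe-key-pairs --output table

  # Connect with the matching private key; adjust the key path and user
  ssh -i ~/.ssh/your-key.pem core@170.20.0.248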

-- Eli
Source: StackOverflow