Not able to SSH into EKS Worker Nodes

1/28/2019

I have created an EKS cluster as specified in https://docs.aws.amazon.com/eks/latest/userguide/getting-started.html

I added worker nodes as specified in Step 3 of the above link: Launch and Configure Amazon EKS Worker Nodes.

I also added a rule to the security group to allow SSH to the worker nodes. But when I try to log in to a worker node with the 'ec2-user' username and a valid key, the SSH login does not succeed.

Can anyone help me debug this issue?

-- Karthik
amazon-ec2
amazon-eks
amazon-web-services
aws-eks
kubernetes

3 Answers

1/29/2019

I found a workaround. I created an EC2 instance in the same VPC as the worker nodes, using the same security group and key pair. Logging in to that newly created EC2 instance works like a charm (I don't know why it won't work for the worker nodes). Once logged in to that instance, I SSHed to the worker nodes from there using their private IPs, and that works as expected.

Again, this is only a workaround. I'm still not sure why I couldn't log in to the worker nodes directly.
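A minimal sketch of that two-hop connection, assuming an ssh-agent is running locally and using placeholder addresses you replace with your own:

# Load the key into the agent and forward it to the intermediate instance
ssh-add ssh-key.pem
ssh -A ec2-user@<bastion-public-ip>

# From that instance, reach the worker node over its private IP
ssh ec2-user@<worker-node-private-ip>

With agent forwarding (-A) the private key never has to be copied onto the intermediate instance.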

-- Karthik
Source: StackOverflow

1/28/2019

I think you are missing the SSH rule in the instance's security group, or you are using the wrong SSH key to connect to the worker nodes.

Please check your security group ID from the console and, if it is not already there, add an inbound SSH rule. [Screenshot: SSH rule for the security group of the worker nodes]

Or you can add the same rule via the AWS CLI:

aws ec2 authorize-security-group-ingress --group-id <security-group-id>  --protocol tcp --port 22 --cidr 0.0.0.0/0

Then, specifying a valid SSH key, you can run the command below to connect to your worker node.

ssh -i "ssh-key.pem" ec2-user@<node-external-ip or node-dns-name>
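If you are not sure of the node's external IP or DNS name, and assuming kubectl is already configured for your cluster, something like this lists the addresses:

kubectl get nodes -o wide    # the INTERNAL-IP and EXTERNAL-IP columns show each node's addresses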

If you lost or no longer have your key, you need to create a new stack in CloudFormation with a new SSH key pair, as described in the following tutorials.

Creating a Key Pair Using Amazon EC2 and Launch and Configure Amazon EKS Worker Nodes
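As a rough sketch, a new key pair can be created from the CLI before updating the worker node stack (the key name here is just an example):

aws ec2 create-key-pair --key-name eks-worker-key --query 'KeyMaterial' --output text > eks-worker-key.pem
chmod 400 eks-worker-key.pem    # restrict permissions so ssh will accept the key file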

I hope this helps.

-- coolinuxoid
Source: StackOverflow

1/28/2019

As far as I remember, by default AWS EKS worker nodes don't have public IPs. You can try creating an EC2 instance with a public IP in the same subnet as your worker nodes, and then use it as a bastion host to SSH into the worker nodes.
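One way to wire that up, as a sketch with placeholder addresses and an assumed key path, is a ProxyJump entry in ~/.ssh/config:

# ~/.ssh/config  (addresses, host aliases, and key path are placeholders)
Host eks-bastion
    HostName <bastion-public-ip>
    User ec2-user
    IdentityFile ~/.ssh/ssh-key.pem

Host eks-worker
    HostName <worker-node-private-ip>
    User ec2-user
    IdentityFile ~/.ssh/ssh-key.pem
    ProxyJump eks-bastion

Then "ssh eks-worker" connects through the bastion in a single step.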

-- Artem Timchenko
Source: StackOverflow