AWS EKS - cluster has no nodes available to schedule pods

12/26/2019

I am new to AWS EKS and am trying to create a cluster and attach worker nodes. I followed the AWS documentation at https://docs.aws.amazon.com/eks/latest/userguide/getting-started-console.html: I used CloudFormation to create a VPC with both private and public subnets, launched the cluster using both subnets, configured kubectl and the AWS CLI, installed aws-iam-authenticator as described in the documentation, launched Amazon EKS Linux worker nodes with a CloudFormation YAML template, and ran all the commands to enable the worker nodes to join the cluster. However, when I run kubectl get nodes the result is No resources found.
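
The kubeconfig and verification steps from the guide roughly correspond to the sketch below; the region and cluster name are placeholders, not my actual values:

    # Point kubectl at the EKS cluster (region and cluster name are placeholders)
    aws eks update-kubeconfig --region us-west-2 --name my-cluster

    # Confirm the API server is reachable
    kubectl get svc

    # Check whether any worker nodes have registered with the cluster
    kubectl get nodes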

I have run the commands below to troubleshoot:

kubectl get pods

kubectl get events

I have installed kubectl both on my Linux instance and on the worker nodes. Please help me fix this issue.

-- heena r
amazon-web-services
aws-eks
kubernetes
kubernetes-pod

1 Answer

1/6/2020

Off the top of my head, 3 things can cause this:

  1. aws-auth might not be set up properly. Make sure the aws-auth ConfigMap exists (kubectl get cm -n kube-system aws-auth -o yaml) and contains the right rolearn given by the CloudFormation output (node group role ARN); see the sketch after this list.
  2. The control plane security group might not allow traffic from the subnets your nodes are in.
  3. The ClusterName passed to the worker node stack does not match the actual EKS cluster name.
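
For point 1, here is a minimal sketch of what the aws-auth ConfigMap should contain, modelled on the standard EKS getting-started template; the account ID and role name in rolearn are placeholders and must be replaced with the NodeInstanceRole ARN from your worker node CloudFormation stack outputs:

    # aws-auth ConfigMap: maps the worker node IAM role to the Kubernetes groups the kubelet needs
    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: aws-auth
      namespace: kube-system
    data:
      mapRoles: |
        - rolearn: arn:aws:iam::111122223333:role/my-node-instance-role
          username: system:node:{{EC2PrivateDNSName}}
          groups:
            - system:bootstrappers
            - system:nodes

Save it as, for example, aws-auth-cm.yaml, apply it with kubectl apply -f aws-auth-cm.yaml, and then run kubectl get nodes --watch; if the role ARN, security groups and cluster name are all correct, the nodes should register and become Ready within a couple of minutes.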
-- Sartigan
Source: StackOverflow