I tried to create an EKS Kubernetes cluster, for example using this: https://github.com/terraform-providers/terraform-provider-aws/tree/master/examples/eks-getting-started
What I get is that all pods stay Pending, kubectl describe po shows the event "No nodes available to schedule pods", and kubectl get nodes returns an empty list.
Changing the AMI or the AWS region doesn't help.
What's wrong?
I was still having an issue with this code: the worker nodes registered only when the eks module was pointed at public subnets:
module "eks" { subnets = ["${module.vpc.public_subnets}"] }
I did not want my workers to be in public subnets, so I changed it to:
module "eks" {
subnets = ["${module.vpc.private_subnets}"]
}
module "vpc" {
single_nat_gateway = false
}
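For context, here is a minimal sketch of the kind of configuration this ends up as, based on the terraform-aws-modules vpc and eks modules; the cluster name, CIDRs, AZs and instance type are placeholders I've filled in, and exact input names can vary between module versions:

module "vpc" {
  source = "terraform-aws-modules/vpc/aws"

  name = "eks-vpc"
  cidr = "10.0.0.0/16"

  azs             = ["us-east-2a", "us-east-2b", "us-east-2c"]
  private_subnets = ["10.0.1.0/24", "10.0.2.0/24", "10.0.3.0/24"]
  public_subnets  = ["10.0.101.0/24", "10.0.102.0/24", "10.0.103.0/24"]

  # Private worker subnets need NAT gateways for outbound access to the
  # EKS API endpoint and ECR; single_nat_gateway = false creates one NAT
  # gateway per public subnet instead of a single shared one.
  enable_nat_gateway = true
  single_nat_gateway = false

  # EKS uses this tag to discover the cluster's subnets.
  tags = {
    "kubernetes.io/cluster/my-eks" = "shared"
  }
}

module "eks" {
  source = "terraform-aws-modules/eks/aws"

  cluster_name = "my-eks"
  vpc_id       = "${module.vpc.vpc_id}"

  # Workers go into the private subnets only.
  subnets = ["${module.vpc.private_subnets}"]

  worker_groups = [
    {
      instance_type        = "t3.medium"
      asg_desired_capacity = 4
    },
  ]
}

The point is that nodes in private subnets still need a route out (via NAT) to register with the control plane; without outbound connectivity the kubelets never join and kubectl get nodes stays empty. After the change the nodes showed up: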
kubectl get nodes
NAME STATUS ROLES AGE VERSION
ip-10-0-1-247.us-east-2.compute.internal Ready <none> 17m v1.11.5
ip-10-0-1-75.us-east-2.compute.internal Ready <none> 17m v1.11.5
ip-10-0-2-225.us-east-2.compute.internal Ready <none> 17m v1.11.5
ip-10-0-3-210.us-east-2.compute.internal Ready <none> 17m v1.11.5
As Matt mentioned, https://github.com/terraform-aws-modules/terraform-aws-eks helped. More specifically, I just used the example from https://github.com/terraform-aws-modules/terraform-aws-eks/tree/master/examples/eks_test_fixture
None of the other examples of a running EKS cluster that I found on the Internet worked for me.