GCP Kubernetes cluster node error NetworkUnavailable

11/19/2019

I am trying to set up a Kubernetes cluster on GCP with Terraform. The Terraform script creates a VPC (firewall, subnet, default route) and the Kubernetes cluster.

Randomly I get a "NetworkUnavailable" condition on a cluster node, but the same Terraform script works fine on the next run.

So there is no issue with the Terraform script itself, and I don't know how to resolve this. If I run the script 10 times, provisioning fails 4 to 5 times.

Error waiting for creating GKE NodePool: All cluster resources were brought up, but the cluster API is reporting that: only 3 nodes out of 4 have registered; cluster may be unhealthy.

Please help me.


Thanks Shrwan

-- user2254560
google-cloud-platform
google-kubernetes-engine
kubernetes
terraform

1 Answer

11/19/2019

This is a fairly common issue when using terraform to create GKE clusters. If you create the cluster manually through the GKE API, you won't have the same error.

Note that when creating a GKE cluster, you only need to create the cluster. It is not necessary to create firewall rules or routes as the GKE API will create them during cluster creation.
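As a rough sketch of what that looks like on the Terraform side (resource names, region, machine type, and node count below are placeholders, and the VPC and subnetwork resources are assumed to be defined elsewhere in your config), the GKE part only needs the cluster and a node pool, with no explicit firewall or route resources:

```hcl
# Minimal sketch: only the cluster and its node pool are declared.
# GKE creates the firewall rules and routes it needs during cluster creation.
resource "google_container_cluster" "primary" {
  name     = "my-gke-cluster"
  location = "us-central1"

  # Manage nodes in a separate node pool instead of the default one.
  remove_default_node_pool = true
  initial_node_count       = 1

  # Attach to an existing VPC/subnetwork managed elsewhere in the config.
  network    = google_compute_network.vpc.self_link
  subnetwork = google_compute_subnetwork.subnet.self_link
}

resource "google_container_node_pool" "primary_nodes" {
  name       = "primary-node-pool"
  location   = "us-central1"
  cluster    = google_container_cluster.primary.name
  node_count = 4

  node_config {
    machine_type = "e2-medium"
    oauth_scopes = ["https://www.googleapis.com/auth/cloud-platform"]
  }
}
```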

Most of the time, this error message means that the nodes are unable to communicate with the master node; this is usually linked to an issue with the network config.

If you are creating a zonal cluster, you might be running into a separate known issue specific to zonal clusters, and there is a third possible root cause that produces the same error as well.
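For reference, whether a cluster is zonal or regional is determined by the location argument on the cluster resource; a minimal sketch (names, locations, and node counts are placeholders):

```hcl
# Zonal cluster: the control plane runs in a single zone.
resource "google_container_cluster" "zonal" {
  name               = "zonal-cluster"
  location           = "us-central1-a"
  initial_node_count = 1
}

# Regional cluster: the control plane is replicated across the region's zones.
resource "google_container_cluster" "regional" {
  name               = "regional-cluster"
  location           = "us-central1"
  initial_node_count = 1
}
```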

-- Patrick W
Source: StackOverflow