I've created a GKE cluster with Terraform, and I want to manage Kubernetes resources with Terraform as well. However, I don't know how to pass the GKE cluster's credentials to the kubernetes provider.
I followed the example in the google_client_config data source documentation, but I get:

data.google_container_cluster.cluster.endpoint is null
Here is my failed attempt: https://github.com/varshard/gke-cluster-terraform/tree/title-terraform

cluster.tf is responsible for creating the GKE cluster, which works fine.

kubernetes.tf is responsible for managing Kubernetes, and it fails to get the GKE credentials.
You don't need the google_container_cluster data source here at all, because the relevant information is already available from the google_container_cluster resource you are creating in the same context. Data sources are for reading data about a resource that was created either entirely outside of Terraform or in a different Terraform context (e.g. a different state file in a different directory where terraform apply is run).
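For illustration, a data source would only make sense in a *separate* configuration that reads a cluster created elsewhere, roughly like this (the name and location here are hypothetical placeholders):

```hcl
# In a different Terraform context that did NOT create the cluster:
data "google_container_cluster" "cluster" {
  name     = "some-existing-cluster" # placeholder for a cluster created elsewhere
  location = "europe-west1"          # placeholder region/zone
}
```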
I'm not sure how you ended up in your current state, where the data source selects an existing container cluster and the resource then creates that same cluster from the data source's outputs, but this is overcomplicated and subtly broken: if you destroyed everything and reapplied, it would not work as is.
Instead, remove the google_container_cluster data source and amend your google_container_cluster resource to be:
resource "google_container_cluster" "cluster" {
  name     = "${var.project}-cluster"
  location = "${var.region}"
  # ...
}
And then refer to this resource in your kubernetes
provider:
provider "kubernetes" {
  load_config_file       = false
  host                   = "https://${google_container_cluster.cluster.endpoint}"
  cluster_ca_certificate = "${base64decode(google_container_cluster.cluster.master_auth.0.cluster_ca_certificate)}"
  token                  = "${data.google_client_config.current.access_token}"
}