I have created a GCP cluster by logging into my GCP project > Container Engine > Create Cluster. But when I want to work with it via kubectl, I can't seem to add it to my configuration.
Using the Google Cloud SDK, I have properly authenticated, set my project and default region/zone, and I can see my cluster:
# Authenticate and set project/region/zone defaults
gcloud auth application-default login
gcloud config set compute/region us-east1
gcloud config set compute/zone us-east1-b
gcloud config set project <myproject>
# List clusters, fetch kubectl credentials, and set the default cluster
gcloud container clusters list
gcloud container clusters get-credentials <cluster name>
gcloud config set container/cluster <cluster name>
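To rule out a stale configuration, I can also confirm the active gcloud settings; gcloud config list prints the properties set above (project, compute/region, compute/zone, container/cluster):
gcloud config list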
So all of this works fine.
I have a blank ~/.kube/config because I wanted to troubleshoot a problem and start over, so my kubeconfig currently shows:
kubectl config view
...
apiVersion: v1
clusters: []
contexts: []
current-context: ""
kind: Config
preferences: {}
users: []
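In case it's relevant, this is how I'm checking whether a KUBECONFIG environment variable might be pointing kubectl at a different file than the one gcloud writes to (as far as I understand, that's a common cause of this kind of mismatch):
echo $KUBECONFIG
ls -l ~/.kube/config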
My problem is that I'm following this guide (see the "Configuring kubectl" section), and I want to inform kubectl about my new cluster. Per the documentation there, I successfully run:
gcloud container clusters get-credentials NAME --zone ZONE
...
Fetching cluster endpoint and auth data.
kubeconfig entry generated for <clustername>.
Yet when I run kubectl config view, I still see everything blank:
apiVersion: v1
clusters: []
contexts: []
current-context: ""
kind: Config
preferences: {}
users: []
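If it helps diagnose this, the raw file can also be inspected directly, bypassing kubectl entirely:
cat ~/.kube/config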
Why can't I start from a completely blank kubectl configuration and add my GCP cluster to it?
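For what it's worth, I know the entries could be created by hand like this (the names, server address, and token below are hypothetical placeholders), but I would expect gcloud container clusters get-credentials to do exactly this for me:
kubectl config set-cluster my-gke-cluster --server=https://<endpoint-ip>
kubectl config set-credentials my-gke-user --token=<token>
kubectl config set-context my-gke-context --cluster=my-gke-cluster --user=my-gke-user
kubectl config use-context my-gke-context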