How to safely provide kubeconfig to kubectl

6/3/2019

Our users are allowed to access Kubernetes clusters only from the management station; there is no way to access the API directly from their laptops/workstations.

Every user possesses a kubeconfig with the secrets belonging to that particular user. As the kubeconfig also contains the token used to authenticate against the Kubernetes API, it is not possible to store the kubeconfig "as is" on the management station's file system.

Is there any way to provide the token/kubeconfig to kubectl, e.g. via STDIN, without exposing it to other users (e.g. the admin of the management station) on the file system?
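
For illustration, the sensitive part is the user entry of the kubeconfig, which carries a static bearer token; anyone able to read the file can read the token (the value below is just a placeholder):

 users:
 - name: foo.bar
   user:
     token: <REDACTED-BEARER-TOKEN>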

-- Sl4dy
kubeconfig
kubectl
kubernetes

2 Answers

6/3/2019

Activate the account and download the cluster credentials using a service account:

 gcloud auth activate-service-account --key-file=${PULL_KEYFILE} --project PROJECT_NAME
 gcloud container clusters get-credentials CLUSTER_NAME --zone ZONE
 # use kubectl as you normally would
 kubectl create namespace ${NAMESPACE} --dry-run -o yaml | kubectl apply -f -
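
gcloud writes the fetched credentials into the kubeconfig selected by the KUBECONFIG variable (falling back to ~/.kube/config), so a possible refinement, sketched below under the assumption of a per-user directory readable only by its owner, is to point KUBECONFIG at a restricted file before running get-credentials:

 # keep the generated credentials in a file only this user can read
 mkdir -p -m 700 "$HOME/.kube-private"
 install -m 600 /dev/null "$HOME/.kube-private/config"
 export KUBECONFIG="$HOME/.kube-private/config"
 gcloud container clusters get-credentials CLUSTER_NAME --zone ZONE
 kubectl get nodes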
-- bern.ar
Source: StackOverflow

6/4/2019

So far I have used the following solution:

  • User specifies an empty token in the kubeconfig file
apiVersion: v1
kind: Config
preferences: {}
users:
- name: foo.bar
  user:
    token:
  • User sets the TOKEN variable without echoing it
read -s TOKEN
  • User specifies the token as a parameter to kubectl (the full flow is put together below)
kubectl --kubeconfig /home/foo.bar/kubeconfig --token "$TOKEN" get nodes
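
Put together, a session on the management station might look like this (the kubeconfig path is the one from the example above):

 # the kubeconfig with the empty token is stored at /home/foo.bar/kubeconfig
 read -s TOKEN        # paste the token, nothing is echoed
 kubectl --kubeconfig /home/foo.bar/kubeconfig --token "$TOKEN" get nodes
 unset TOKEN          # drop the variable once it is no longer needed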
-- Sl4dy
Source: StackOverflow