Kubernetes Showing error "User "system:anonymous" cannot get at the cluster scope."

6/13/2017

I have created a Kubernetes cluster on AWS EC2 using kubeadm. I can see all the nodes connected, and my deployment and service are working; when I expose my deployment I can even access it from outside the cluster. But when I try to access the Kubernetes API, whether from outside or locally, I get the error:

"User "system:anonymous" cannot get at the cluster scope."
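The "system:anonymous" message means the request reached the API server without any credentials, so it was treated as the anonymous user, which has no permissions at cluster scope. The kubeadm-generated kubeconfig embeds the admin client certificate, key, and CA base64-encoded; a minimal sketch of decoding them so a raw client such as curl can authenticate (the function name and output file names below are my own, not part of kubectl):

```shell
#!/bin/sh
# Decode the base64-encoded credentials that a kubeadm-style kubeconfig
# embeds inline, writing them out as PEM files. The helper name and the
# ca.crt/client.crt/client.key file names are illustrative assumptions.
extract_kubeconfig_creds() {
    conf="$1"   # path to a kubeadm-style kubeconfig, e.g. /etc/kubernetes/admin.conf
    grep 'certificate-authority-data' "$conf" | awk '{print $2}' | base64 -d > ca.crt
    grep 'client-certificate-data'    "$conf" | awk '{print $2}' | base64 -d > client.crt
    grep 'client-key-data'            "$conf" | awk '{print $2}' | base64 -d > client.key
}

# On the master you would then call the API with those credentials, e.g.:
#   extract_kubeconfig_creds /etc/kubernetes/admin.conf
#   curl --cacert ca.crt --cert client.crt --key client.key https://172.31.25.217:6443/api
```

With the client certificate presented, the API server identifies the caller as kubernetes-admin instead of system:anonymous.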

My cluster info shows this:

Kubernetes master is running at https://172.31.25.217:6443 
KubeDNS is running at https://172.31.25.217:6443/api/v1/proxy/namespaces/kube-system/services/kube-dns

172.31.25.217 is the local IP of the cluster

I am using the latest version of kubectl and kubeadm

kubectl version
Client Version: version.Info{Major:"1", Minor:"6", GitVersion:"v1.6.4", GitCommit:"d6f433224538d4f9ca2f7ae19b252e6fcb66a3ae", GitTreeState:"clean", BuildDate:"2017-05-19T18:44:27Z", GoVersion:"go1.7.5", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"6", GitVersion:"v1.6.4", GitCommit:"d6f433224538d4f9ca2f7ae19b252e6fcb66a3ae", GitTreeState:"clean", BuildDate:"2017-05-19T18:33:17Z", GoVersion:"go1.7.5", Compiler:"gc", Platform:"linux/amd64"}
ubuntu@ip-172-31-25-217:/etc/kubernetes/manifests$ kubeadm version
kubeadm version: version.Info{Major:"1", Minor:"6", GitVersion:"v1.6.4", GitCommit:"d6f433224538d4f9ca2f7ae19b252e6fcb66a3ae", GitTreeState:"clean", BuildDate:"2017-05-19T18:33:17Z", GoVersion:"go1.7.5", Compiler:"gc", Platform:"linux/amd64"}

Even when I run kubectl proxy and try to access the dashboard from outside the cluster at http://MASTER_IP:8001/ui, I am unable to do so; the connection is refused.
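A "connection refused" from another host usually means the service is bound to the loopback interface only, and kubectl proxy does default to listening on 127.0.0.1, so the dashboard URL only answers on the master itself. A generic sketch for checking which address a port is bound to (the helper name is my own, and port 8001 is assumed from the proxy default):

```shell
#!/bin/sh
# Print the local address(es) a TCP port is listening on, using ss.
# 127.0.0.1:PORT means loopback only; 0.0.0.0:PORT (or *:PORT) means
# the port is reachable from other hosts as well.
check_bind() {
    port="$1"
    ss -tln 2>/dev/null | awk -v p=":$port$" '$4 ~ p {print $4}'
}

# Example: check_bind 8001 would print 127.0.0.1:8001 for a default
# kubectl proxy, and 0.0.0.0:8001 once it is started with --address='0.0.0.0'.
```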

What is the trick I am missing? Can anyone help me?

Kubectl config view:

kubectl config view
apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: REDACTED
    server: https://172.31.17.145:6443
  name: kubernetes
contexts:
- context:
    cluster: kubernetes
    user: kubernetes-admin
  name: kubernetes-admin@kubernetes
current-context: kubernetes-admin@kubernetes
kind: Config
preferences: {}
users:
- name: kubernetes-admin
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED


-- Anshul Jindal
amazon-ec2
amazon-web-services
kubeadm
kubectl
kubernetes

1 Answer

6/13/2017

I was able to solve the problem of the dashboard not being accessible from outside the cluster by running:

kubectl proxy --address='0.0.0.0' --port=8001 --accept-hosts='^*$'

Note that this makes the proxy listen on all interfaces and accept requests with any Host header, and the proxy forwards requests using its own admin credentials, so anyone who can reach port 8001 can talk to the API. Restrict access (for example with an EC2 security group) accordingly.
-- Anshul Jindal
Source: StackOverflow