I am new to Kubernetes cluster setup. I have Kubernetes running on an AWS EC2 instance with RHEL 8. I tried to access the Kubernetes dashboard via localhost, but it's not working.
1. I set up the Kubernetes cluster on an AWS EC2 instance using kubeadm.
2. I am able to deploy pods in the cluster.
3. I deployed the Kubernetes dashboard and tried to access it via localhost and from outside, but I do not get the dashboard login window.
4. I tried with ClusterIP, then changed the service to NodePort and LoadBalancer.
5. I am able to run kubectl proxy and get kubectl cluster-info (see the proxy URL sketch after this list).
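For reference, while kubectl proxy is running, the dashboard is normally reached through the API server's proxy path. A minimal sketch, assuming the dashboard Service is named kubernetes-dashboard in the kube-system namespace and serves HTTPS (assumptions inferred from the pod listing below):

[root@managernode ~]# kubectl proxy
# while the proxy runs, open this URL on the same machine:
# http://localhost:8001/api/v1/namespaces/kube-system/services/https:kubernetes-dashboard:/proxy/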
Could anyone guide me on how to deploy and access the Kubernetes dashboard on an AWS EC2 instance?
Thanks in advance.
[root@managernode ~]# cat kubernetes-dashboard.yaml
---
apiVersion: v1
kind: ServiceAccount
metadata:
  labels:
    k8s-app: kubernetes-dashboard
    kubernetes.io/cluster-service: "true"
    addonmanager.kubernetes.io/mode: Reconcile
  name: kubernetes-dashboard-admin
  namespace: kube-system
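Assuming the file is applied as shown (the filename matches the cat above), the service account can be created and verified like this:

[root@managernode ~]# kubectl apply -f kubernetes-dashboard.yaml
[root@managernode ~]# kubectl -n kube-system get serviceaccount kubernetes-dashboard-admin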
[root@managernode ~]# kubectl get pods --namespace=kube-system
NAME                                       READY   STATUS    RESTARTS   AGE
calico-kube-controllers-564b6667d7-qglhj   1/1     Running   0          22h
calico-node-4cpv9                          1/1     Running   0          22h
coredns-5644d7b6d9-c8vj4                   1/1     Running   0          22h
coredns-5644d7b6d9-l8qft                   1/1     Running   0          22h
etcd-managernode                           1/1     Running   0          22h
kube-apiserver-managernode                 1/1     Running   0          22h
kube-controller-manager-managernode        1/1     Running   0          22h
kube-proxy-fhfk7                           1/1     Running   0          22h
kube-scheduler-managernode                 1/1     Running   0          22h
kubernetes-dashboard-7c54d59f66-gvslw      1/1     Running   0          22h
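The dashboard pod is Running, but access also depends on how its Service is exposed. A quick check (the service name kubernetes-dashboard is an assumption based on the pod name):

[root@managernode ~]# kubectl -n kube-system get svc kubernetes-dashboard
# TYPE shows whether the service is ClusterIP, NodePort or LoadBalancer,
# and NodePort services also list the allocated port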
[root@managernode ~]# kubectl proxy
Starting to serve on 127.0.0.1:8001
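Note that kubectl proxy listens only on 127.0.0.1 by default, so the proxy URL is reachable only from the instance itself, not "from outside". One option is an SSH tunnel from your workstation (a sketch; the key file, user, and IP are placeholders):

# run on your local machine, not on the EC2 instance
ssh -i <your-key>.pem -L 8001:127.0.0.1:8001 ec2-user@<ec2-public-ip>
# then browse to the same http://localhost:8001/... URL locally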
Besides the service account, you also need to define a ClusterRoleBinding:
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: eks-admin
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin
subjects:
- kind: ServiceAccount
  name: kubernetes-dashboard-admin
  namespace: kube-system
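Once the binding is applied, the login token can be read from the service account's auto-created secret (a sketch for clusters of this vintage; the filename is a placeholder):

kubectl apply -f eks-admin-binding.yaml
kubectl -n kube-system describe secret $(kubectl -n kube-system get secret | grep kubernetes-dashboard-admin | awk '{print $1}')
# paste the "token:" value into the dashboard's Token login field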
To make sure you did everything in the proper order and followed every step, read the documentation: aws-kubernetes-dashboard.