I have installed Kubernetes Dashboard on a Kubernetes 1.13 cluster as described here:
```
kubectl apply -f https://raw.githubusercontent.com/kubernetes/dashboard/v1.10.1/src/deploy/recommended/kubernetes-dashboard.yaml
```

I've also configured the dashboard to serve its content insecurely, because I want to expose it via an Ingress at https://my-cluster/dashboard, where the TLS connection will be terminated at the ingress controller.
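For context, the Ingress I eventually plan to create would look roughly like this; the host, path, and TLS secret name are placeholders, and any annotations specific to the ingress controller (e.g. path rewriting) are omitted:

```yaml
# Sketch only: nothing like this is deployed yet.
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: dashboard
  namespace: kube-system
spec:
  tls:
  - hosts:
    - my-cluster
    secretName: my-cluster-tls    # TLS terminated here, at the ingress controller
  rules:
  - host: my-cluster
    http:
      paths:
      - path: /dashboard
        backend:
          serviceName: kubernetes-dashboard
          servicePort: 80         # plain HTTP to the service after the changes below
```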
To this end, I have edited:

- `service/kubernetes-dashboard` in namespace `kube-system`, changing its ports from `{port: 443, targetPort: 8443}` to `{port: 80, protocol: TCP, targetPort: 9090}`
- `deployment/kubernetes-dashboard` in namespace `kube-system`, changing its container ports from `{containerPort: 8443, protocol: TCP}` to `{containerPort: 9090, protocol: TCP}` (and the `livenessProbe` analogously), and changing its `args` from `[--auto-generate-certificates]` to `[--enable-insecure-login]`

The edited sections are shown below. This allows me to contact the dashboard from a cluster node via HTTP at the service's cluster IP address on port 80 (no Ingress is configured yet).
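For reference, the relevant sections of the edited objects now look roughly like this, abbreviated to the changed fields (the `livenessProbe` shown reflects my "analogous" change):

```yaml
# service/kubernetes-dashboard: spec.ports after the edit
ports:
- port: 80
  protocol: TCP
  targetPort: 9090
```

```yaml
# deployment/kubernetes-dashboard: the dashboard container after the edit
args:
- --enable-insecure-login
ports:
- containerPort: 9090
  protocol: TCP
livenessProbe:
  httpGet:
    path: /
    port: 9090
    scheme: HTTP
```

And this is the HTTP check that currently succeeds from a cluster node:

```bash
CLUSTER_IP=$(kubectl -n kube-system get service kubernetes-dashboard -o jsonpath='{.spec.clusterIP}')
curl -s "http://${CLUSTER_IP}/" | head -n 5   # returns the dashboard's HTML
```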
I have also created a sample user as explained here and extracted its token. The token works, e.g. in `kubectl --token $token get pod --all-namespaces`, so it apparently possesses cluster-admin privileges. However, if I enter the same token into the dashboard's login screen, I get "Authentication failed. Please try again.".
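For completeness, this is roughly the setup from the linked guide (the `admin-user` name and namespace follow the guide's example), plus the command I used to extract the token:

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: admin-user
  namespace: kube-system
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: admin-user
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin
subjects:
- kind: ServiceAccount
  name: admin-user
  namespace: kube-system
```

```bash
# Read the token from the service account's auto-created secret.
token=$(kubectl -n kube-system get secret \
  $(kubectl -n kube-system get serviceaccount admin-user -o jsonpath='{.secrets[0].name}') \
  -o jsonpath='{.data.token}' | base64 --decode)
```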
What could the reason be? How can I diagnose the issue further and solve it? (The dashboard's log provides no help at this point.)
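In case it helps, this is how I checked the log (assuming the deployment name from the recommended manifest):

```bash
kubectl -n kube-system logs deployment/kubernetes-dashboard --tail=50
```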
UPDATE: If I keep the dashboard's standard configuration (i.e., secure access over HTTPS), the same token is accepted.