I am trying to authenticate to my Kubernetes dashboard using Keycloak as the identity provider, but I am getting an "invalid bearer token" error. Here are the details.
For Keycloak, I have already set up a client 'gatekeeper' and a user 'alice' who is a member of the group 'developers'. I have also mapped the user attribute 'name' and the group membership attribute 'groups'. I can test this setup successfully using the kubectl command-line utility, but I cannot get it to work through the dashboard.
When I hit the URL kubernetes-dashboard.localdev.me:8081/, I can authenticate with Keycloak and the Kubernetes dashboard loads, but I get an Unauthorized notification and cannot list my resources in the dashboard, even though I have granted user 'alice' the required permissions via RBAC.
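For reference, the kubectl side was set up along the lines of the built-in oidc auth-provider (a sketch rather than the exact commands; the secret and token values are placeholders and the CA path is an assumption):

kubectl config set-credentials alice \
  --auth-provider=oidc \
  --auth-provider-arg=idp-issuer-url=https://kubemaster:8443/auth/realms/local \
  --auth-provider-arg=client-id=gatekeeper \
  --auth-provider-arg=client-secret=<client-secret> \
  --auth-provider-arg=id-token=<id_token> \
  --auth-provider-arg=refresh-token=<refresh_token> \
  --auth-provider-arg=idp-certificate-authority=/etc/kubernetes/ssl/kubemaster.crt
# with these credentials, requests like the following succeed:
kubectl --user=alice get pods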
Kubernetes cluster
NAME STATUS ROLES AGE VERSION INTERNAL-IP EXTERNAL-IP OS-IMAGE KERNEL-VERSION CONTAINER-RUNTIME
kubemaster Ready control-plane,master 11d v1.23.1 192.168.122.54 <none> Ubuntu 20.04.3 LTS 5.11.0-43-generic docker://20.10.12
kubenode Ready <none> 11d v1.23.1 192.168.122.198 <none> Ubuntu 20.04.3 LTS 5.11.0-43-generic docker://20.10.12
Ingress controller -
https://kubernetes.github.io/ingress-nginx/
API server configuration for the Keycloak IDP
...
- --oidc-issuer-url=https://kubemaster:8443/auth/realms/local
- --oidc-client-id=gatekeeper
- --oidc-username-claim=name
- --oidc-groups-claim=groups
- --oidc-ca-file=/etc/kubernetes/ssl/kubemaster.crt
...
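One check worth doing against these flags (a sketch, assuming curl and jq are available on the control-plane node): the issuer embedded in the tokens must match --oidc-issuer-url exactly, so the discovery document served by Keycloak should report the same URL:

curl --cacert /etc/kubernetes/ssl/kubemaster.crt \
  https://kubemaster:8443/auth/realms/local/.well-known/openid-configuration | jq .issuer
# expected output, matching --oidc-issuer-url above:
# "https://kubemaster:8443/auth/realms/local"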
Keycloak Docker server
version: '3.8'
services:
  keycloak:
    #image: rsk-internal-docker.dkrreg.mmih.biz/risk-keycloak:15.0.1-1
    image: quay.io/keycloak/keycloak:16.1.0
    environment:
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: admin
      PROXY_ADDRESS_FORWARDING: "true"
    ports:
      - "8080:8080"
      - "8443:8443"
    volumes:
      - "$PWD/tls.key:/etc/x509/https/tls.key"
      - "$PWD/tls.crt:/etc/x509/https/tls.crt"
      - "$PWD/keycloak-latest-db:/opt/jboss/keycloak/standalone/data"
Kubernetes dashboard - the recommended.yaml definition file from the Kubernetes documentation:
kubectl apply -f https://raw.githubusercontent.com/kubernetes/dashboard/v2.4.0/aio/deploy/recommended.yaml
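Since the Gatekeeper --upstream-url below points at https://kubernetes-dashboard, it is worth confirming that the service created by recommended.yaml has that name and exposes HTTPS (sketch):

kubectl -n kubernetes-dashboard get svc kubernetes-dashboard
# recommended.yaml creates this service with port 443 -> targetPort 8443,
# which is what the https:// upstream URL implies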
Gatekeeper OAuth proxy
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    k8s-app: gatekeeper-proxy
  name: gatekeeper-proxy
  namespace: kubernetes-dashboard
spec:
  replicas: 1
  selector:
    matchLabels:
      k8s-app: gatekeeper-proxy
  template:
    metadata:
      labels:
        k8s-app: gatekeeper-proxy
    spec:
      containers:
      - command:
        - /opt/keycloak-gatekeeper
        - --discovery-url=https://192.168.122.54:8443/auth/realms/local
        - --client-id=gatekeeper
        - --client-secret=jZzvJ0wCDDwltV3tAf0SXSbVoKXM1RqV
        - --listen=0.0.0.0:3000
        - --encryption-key=vGcLt8ZUdPX5fXhtLZaPHZkGWHZrT6aa
        - --redirection-url=https://kubernetes-dashboard.localdev.me:8081/
        - --enable-refresh-tokens=true
        - --upstream-url=https://kubernetes-dashboard
        - --skip-openid-provider-tls-verify=true
        - --secure-cookie=false
        image: keycloak/keycloak-gatekeeper:latest
        #image: carlosedp/keycloak-gatekeeper:latest
        imagePullPolicy: Always
        name: gatekeeper-proxy
        ports:
        - containerPort: 3000
          protocol: TCP
          name: http
---
apiVersion: v1
kind: Service
metadata:
  labels:
    k8s-app: gatekeeper-proxy
  name: gatekeeper-proxy
  namespace: kubernetes-dashboard
spec:
  ports:
  - name: http
    port: 3000
    protocol: TCP
    targetPort: 3000
  selector:
    k8s-app: gatekeeper-proxy
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  annotations:
    #nginx.ingress.kubernetes.io/auth-url: "https://$host/oauth2/auth"
    #nginx.ingress.kubernetes.io/auth-signin: "https://$host/oauth2/start?rd=$escaped_request_uri"
    #kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/rewrite-target: /
    nginx.ingress.kubernetes.io/proxy-buffer-size: "64k"
    #cert-manager.io/cluster-issuer: ca-issuer
  name: gatekeeper-proxy
  namespace: kubernetes-dashboard
spec:
  ingressClassName: nginx
  rules:
  - host: kubernetes-dashboard.localdev.me
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: gatekeeper-proxy
            port:
              number: 3000
  tls:
  - hosts:
    - kubernetes-dashboard.localdev.me
    secretName: kubernetes-dashboard-ingress-tls
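Basic wiring checks I ran for the proxy and the ingress (sketch):

kubectl -n kubernetes-dashboard rollout status deploy/gatekeeper-proxy
# proxy pod should become Ready
kubectl -n kubernetes-dashboard logs deploy/gatekeeper-proxy --tail=50
# look for the token injection lines (see the logs below)
kubectl -n kubernetes-dashboard get ingress gatekeeper-proxy
# HOSTS should show kubernetes-dashboard.localdev.me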
Gatekeeper proxy logs - showing the token being generated and injected:
1.641220737766937e+09 info starting the service {"prog": "keycloak-gatekeeper", "author": "Keycloak", "version": "7.0.0 (git+sha: f66e137, built: 03-09-2019)"}
1.6412207377670407e+09 info attempting to retrieve configuration discovery url {"url": "https://192.168.122.54:8443/auth/realms/local", "timeout": "30s"}
1.6412207377766109e+09 info successfully retrieved openid configuration from the discovery
1.641220737778449e+09 info enabled reverse proxy mode, upstream url {"url": "https://kubernetes-dashboard"}
1.6412207377785714e+09 info using session cookies only for access and refresh tokens
1.6412207377785907e+09 info adding a default denial into the protected resources
1.641220737778598e+09 info protecting resource {"resource": "uri: /*, methods: DELETE,GET,HEAD,OPTIONS,PATCH,POST,PUT,TRACE, required: authentication only"}
1.6412207377788239e+09 info keycloak proxy service starting {"interface": "0.0.0.0:3000"}
1.6412207747293563e+09 info accces token for user has expired, attemping to refresh the token {"client_ip": "192.168.1.107:42948", "email": "alice@stack.com"}
1.6412207747479768e+09 info injecting the refreshed access token cookie {"client_ip": "192.168.1.107:42948", "cookie_name": "kc-access", "email": "alice@stack.com", "refresh_expires_in": 1800, "expires_in": 299.252029216}
API server log -
E0103 14:43:23.960726 1 authentication.go:63] "Unable to authenticate the request" err="invalid bearer token"
E0103 14:43:23.961244 1 authentication.go:63] "Unable to authenticate the request" err="invalid bearer token"
E0103 14:43:23.962304 1 authentication.go:63] "Unable to authenticate the request" err="invalid bearer token"
E0103 14:43:23.991455 1 authentication.go:63] "Unable to authenticate the request" err="invalid bearer token"
E0103 14:43:23.991526 1 authentication.go:63] "Unable to authenticate the request" err="invalid bearer token"
E0103 14:43:23.991602 1 authentication.go:63] "Unable to authenticate the request" err="invalid bearer token"
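To narrow down whether the problem is the token itself rather than the dashboard or the proxy, the same token can be presented to the API server directly (sketch; 6443 is the default kubeadm API server port and the token value is a placeholder, e.g. copied from the kc-access cookie):

TOKEN='<paste the token the dashboard is sending>'
curl -k -H "Authorization: Bearer ${TOKEN}" \
  https://192.168.122.54:6443/api/v1/namespaces
# a 401 plus the same "invalid bearer token" log line points at the token/OIDC flags;
# a 200 or 403 means the token is accepted and the problem lies in RBAC or the dashboard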
ClusterRole and ClusterRoleBinding
kind: ClusterRole
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: developer-role
rules:
- apiGroups: [""]
  resources: ["namespaces", "pods"]
  verbs: ["get", "watch", "list"]
---
kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: developer-crb
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: developer-role
subjects:
- kind: Group
  name: "developers"
Please suggest if I am missing anything here, whether there are any compatibility issues between the versions I am currently using for the Kubernetes cluster, Keycloak server and Gatekeeper proxy, or any way I can troubleshoot this further.
Thanks, Sudhir
The image 'keycloak/keycloak-gatekeeper:latest' is no longer supported, and as per the suggestion on the Keycloak portal I have opted for oauth2-proxy as the implementation.
Here are the details of the solution for Kubernetes authentication using oauth2-proxy and Keycloak OIDC, as resolved by myself: https://stackoverflow.com/questions/70584157/unable-to-load-kubernetes-dashboard-after-successful-oauth2/70705961#70705961