Getting error from Kubernetes server while logging in - ClusterRoleBinding

4/5/2020

I am using Keycloak as my identity provider for Kubernetes, and kubelogin to get the token. The token seems to work, but I am getting the error below. I think there is some issue with the ClusterRoleBinding that is preventing it from working.
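
For reference, the kubeconfig user entry that drives kubelogin looks roughly like the snippet below. This is only a sketch: it assumes the exec-plugin form of kubelogin and reuses the issuer URL and client ID from the API server manifest further down; the client secret is a placeholder, not a value from the original setup.

users:
- name: oidc
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: kubectl
      args:
      # kubelogin performs the OIDC flow against Keycloak and returns the ID token to kubectl
      - oidc-login
      - get-token
      - --oidc-issuer-url=https://test1.example.com/auth/realms/kubernetes
      - --oidc-client-id=kubernetes
      - --oidc-client-secret=<client-secret>   # placeholder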

  • What's the error
Error from server (Forbidden): pods is forbidden: User "test" cannot list resource "pods" in API group "" in the namespace "default"

Additional Information

  • API server manifest
    - --oidc-issuer-url=https://test1.example.com/auth/realms/kubernetes
    - --oidc-username-claim=preferred_username
    - --oidc-username-prefix=-
    - --oidc-groups-claim=groups
    - --oidc-client-id=kubernetes
    - --oidc-ca-file=/etc/ssl/certs/ca.crt
  • Cluster role and cluster role binding (a SubjectAccessReview check of what these grant is sketched after the YAML)
kind: ClusterRole
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: cluster-admin
rules:
- apiGroups: ["*"]
  resources: ["*"]
  verbs: ["*"]

---

kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: admin-rolebinding
subjects:
- kind: User
  name: //test1.example.com.com/auth/realms/kubernetes#23fd6g03-e03e-450e-8b5d-07b19007c443
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io
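
One way to check what RBAC itself grants to the user name shown in the error, independent of the OIDC flow, is to submit a SubjectAccessReview (sketched below; the user name "test" is taken from the Forbidden message above):

apiVersion: authorization.k8s.io/v1
kind: SubjectAccessReview
spec:
  # the user name exactly as the API server reports it in the Forbidden error
  user: test
  resourceAttributes:
    namespace: default
    verb: list
    resource: pods

Creating this with kubectl create -f sar.yaml -o yaml returns a status with allowed: true or false; kubectl auth can-i list pods --as=test is a shorter equivalent.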

Is there anything I am missing to get this to work?

-- Shash
keycloak
kubernetes
rbac

1 Answer

4/7/2020

After digging a lot, I found the issue. Rather than using the Keycloak issuer URL as the subject name, we have to use the username itself (the value of the preferred_username claim), because --oidc-username-prefix=- disables username prefixing, so the API server sees the bare username. Here is the example YAML:

kind: ClusterRole
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: cluster-admin
rules:
- apiGroups: ["*"]
  resources: ["*"]
  verbs: ["*"]

---

kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: admin-rolebinding
subjects:
- kind: User
  name: test
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io
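
Since the API server is also configured with --oidc-groups-claim=groups, the binding can alternatively target a group from the token rather than individual users. A sketch, assuming a group named kube-admins exists in the Keycloak realm and is mapped into the token's groups claim:

kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: admin-group-rolebinding
subjects:
- kind: Group
  # assumed group name, taken from the token's "groups" claim
  name: kube-admins
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io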
-- Shash
Source: StackOverflow