kubectl logs -f gets "Authorization error"

2/7/2019

I recently created a cluster on EKS with eksctl. Running kubectl logs -f mypod-0 fails with an authorization error:

Error from server (InternalError): Internal error occurred: Authorization error (user=kube-apiserver-kubelet-client, verb=get, resource=nodes, subresource=proxy)

Any advice and insight is appreciated.

-- Kok How Teh
amazon-eks
authorization
kubernetes

1 Answer

2/16/2019

You need to create a ClusterRoleBinding that binds a ClusterRole to the user kube-apiserver-kubelet-client:

kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: kubelet-api-admin
subjects:
- kind: User
  name: kube-apiserver-kubelet-client
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: system:kubelet-api-admin
  apiGroup: rbac.authorization.k8s.io

system:kubelet-api-admin is a built-in ClusterRole that grants access to the kubelet API (the nodes/proxy, nodes/log, and related subresources), but you can substitute another role with the appropriate permissions.
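
A minimal sketch of applying and checking the binding, assuming the manifest above is saved as kubelet-api-admin-binding.yaml (the filename is arbitrary):

# Create the ClusterRoleBinding
kubectl apply -f kubelet-api-admin-binding.yaml

# Verify the user can now reach the kubelet proxy subresource
kubectl auth can-i get nodes/proxy --as=kube-apiserver-kubelet-client

If the binding is in place, the second command should print "yes", and kubectl logs -f should start working again.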

-- C0d3ine
Source: StackOverflow