Cannot get kubectl logs on worker nodes

5/31/2019

I created a Kubernetes cluster consisting of one master node and three worker nodes (the master node also acts as a worker). Everything was working: I can deploy pods, exec into pods, get logs... but I have one logging problem. I cannot get logs from pods running on the worker nodes; kubectl logs <pod-name> only works for pods on the master node.

This is my error msg:

error: You must be logged in to the server (the server has asked for the client to provide credentials ( pods/log nginx))

It occurs only for pods scheduled on a worker node. Help me.

-- WOOJIN NA
kubernetes

1 Answer

5/31/2019

kubectl logs is proxied by the API server to the kubelet on the node where the pod runs, so a misconfigured kubelet on a worker node produces exactly this error. On your master node (and on each worker), check the kubelet logs:

journalctl -u kubelet

There you might see something like:

Flag --some-flag has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag

Remove those deprecated flags from the kubelet's startup arguments and move the corresponding settings into the kubelet config file /var/lib/kubelet/config.yaml.
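On kubeadm-provisioned nodes, the extra flags usually come from /var/lib/kubelet/kubeadm-flags.env or a systemd drop-in under /etc/systemd/system/kubelet.service.d/. A hypothetical before/after sketch, assuming the deprecated flags were --anonymous-auth and --client-ca-file (your actual flags and paths may differ):

```
# /var/lib/kubelet/kubeadm-flags.env (kubeadm's default location; illustrative content)

# Before: auth flags passed on the kubelet command line (now deprecated)
KUBELET_KUBEADM_ARGS="--anonymous-auth=false --client-ca-file=/etc/kubernetes/pki/ca.crt --cgroup-driver=systemd"

# After: the auth settings are removed here and expressed instead in
# /var/lib/kubelet/config.yaml, leaving only the non-deprecated flags
KUBELET_KUBEADM_ARGS="--cgroup-driver=systemd"
```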

Also, the defaults for the kubelet's authentication and authorization settings differ between command-line flags and the config file, so make sure to set the "legacy defaults" explicitly in the config file to preserve the existing behavior.

Here is the relevant snippet of my config.yaml:

authentication:
  anonymous:
    enabled: false # set to true to enable anonymous access
  webhook:
    cacheTTL: 2m0s
    enabled: true
  x509:
    clientCAFile: /etc/kubernetes/pki/ca.crt
authorization:
  mode: Webhook
  webhook:
    cacheAuthorizedTTL: 5m0s
    cacheUnauthorizedTTL: 30s
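After editing /var/lib/kubelet/config.yaml on the affected worker nodes, restart the kubelet with systemctl restart kubelet and retry kubectl logs. As a quick sanity check, here is a minimal self-contained sketch that greps a copy of the config for the settings above (on a real node, point CONFIG at /var/lib/kubelet/config.yaml instead of the heredoc):

```shell
# Sanity-check a kubelet config for webhook authn/authz and the client CA.
# CONFIG is a temp copy of the snippet above so this runs anywhere;
# on a real node use CONFIG=/var/lib/kubelet/config.yaml instead.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
authentication:
  anonymous:
    enabled: false
  webhook:
    enabled: true
  x509:
    clientCAFile: /etc/kubernetes/pki/ca.crt
authorization:
  mode: Webhook
EOF

# Each check prints OK only if the expected setting is present
grep -q 'mode: Webhook' "$CONFIG" && echo "authorization mode: OK"
grep -q 'clientCAFile: /etc/kubernetes/pki/ca.crt' "$CONFIG" && echo "client CA file: OK"
```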
-- A_Suh
Source: StackOverflow