How do I send logs to a GELF UDP endpoint from Kubernetes on a per-pod basis

7/23/2018

I've recently started using Kubernetes and am now looking at how to configure centralised logging. For the majority of the pods, the application itself logs straight to the GELF endpoint (Logstash); however, there are a number of "management" pods whose logs I need to collect too.

Previously, when I was using Docker Swarm, I would simply add the log driver (and relevant configuration) to the compose file. There doesn't seem to be an equivalent option in Kubernetes.

I looked at using Fluentd to read the logs straight from /var/log/containers, but I ran into a couple of issues here:

  1. There doesn't seem to be any easy way to specify which pods should log to Logstash. I get that you can create filters etc., but that doesn't seem very maintainable going forward; something driven by annotations on the individual pods seems more sensible.

  2. The logs in /var/log/containers are in the json-file log format, not GELF.
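For context on point 2: Docker's json-file driver wraps every log line in a JSON envelope, so a raw entry under /var/log/containers looks roughly like this (message and timestamp are illustrative):

```json
{"log":"INFO starting server\n","stream":"stdout","time":"2018-07-23T10:15:04.123456789Z"}
```

Any forwarder therefore has to parse this envelope and re-emit the inner message as GELF itself.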

Is there any way in Kubernetes to use the built-in Docker logging driver on a per-pod basis to easily log to the GELF endpoint?

-- thewire247
docker
gelf
kubernetes

1 Answer

7/24/2018

Try using fluentd with the Kubernetes metadata plugin (fluent-plugin-kubernetes_metadata_filter) to enrich the local json-file Docker logs with pod metadata and send them to Graylog2.
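A minimal sketch of such a pipeline follows. The GELF output assumes the third-party fluent-plugin-gelf plugin; the annotation name `gelf-logging`, the host `logstash.example.com`, and the port are placeholders for your own cluster, and the annotation-based opt-in is one possible approach, not the plugin's default behaviour:

```
<source>
  @type tail
  path /var/log/containers/*.log
  pos_file /var/log/fluentd-containers.log.pos
  tag kubernetes.*
  format json
  read_from_head true
</source>

<filter kubernetes.**>
  @type kubernetes_metadata
  # Annotations are only attached to records if they match this list
  # (hypothetical annotation name)
  annotation_match [ "gelf-logging" ]
</filter>

# Keep only pods that opted in via the annotation above
<filter kubernetes.**>
  @type grep
  <regexp>
    key $["kubernetes"]["annotations"]["gelf-logging"]
    pattern ^true$
  </regexp>
</filter>

<match kubernetes.**>
  @type gelf
  host logstash.example.com
  port 12201
  protocol udp
</match>
```

This addresses point 1 from the question: pods are selected by annotation rather than by hand-written per-pod filters, so adding a new "management" pod to the pipeline only requires annotating it.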

`tag_to_kubernetes_name_regexp`: the regular expression the plugin uses to extract Kubernetes metadata (pod name, container name, namespace) from the current fluentd tag.
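For example, with a `tail` source tagging files as `kubernetes.*`, the log file name itself carries the metadata that the regexp splits apart (names below are illustrative):

```
# File:  /var/log/containers/myapp-7d49f_default_web-<container-id>.log
# Tag:   kubernetes.var.log.containers.myapp-7d49f_default_web-<container-id>.log
# Yields pod_name=myapp-7d49f, namespace=default, container_name=web
```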

-- Akar
Source: StackOverflow