Parse stdout of containers in OpenShift

2/14/2020

I'm a bit of a noob when it comes to OpenShift/Kubernetes, so I hope I can explain my use case well enough.

I have a Docker image of a small "program" that can be summarised as follows: it tail -f's a log file, and when a line matches a regex specified in its config, it sends an email to someone. Think of it as monitoring a production application log and emailing a stacktrace to the Ops team whenever one shows up in the log.
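For context, the core of such a monitor could look roughly like the sketch below (a minimal illustration only; the log path, regex and SMTP details are hypothetical stand-ins for whatever the real config supplies):

```python
import re
import smtplib
import time
from email.message import EmailMessage

LOG_PATH = "/logs/app.log"                 # hypothetical path (e.g. a shared volume mount)
PATTERN = re.compile(r"Exception|ERROR")   # hypothetical regex, normally read from config

def send_alert(line: str) -> None:
    # Hypothetical SMTP details; the real program would read these from its config.
    msg = EmailMessage()
    msg["Subject"] = "Stacktrace detected in application log"
    msg["From"] = "monitor@example.com"
    msg["To"] = "ops@example.com"
    msg.set_content(line)
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)

def follow(path: str):
    # Minimal equivalent of 'tail -f': start at the end of the file
    # and yield new lines as they are appended.
    with open(path) as f:
        f.seek(0, 2)
        while True:
            line = f.readline()
            if not line:
                time.sleep(1)
                continue
            yield line

for line in follow(LOG_PATH):
    if PATTERN.search(line):
        send_alert(line)
```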

This works fine when the application log is produced by, say, a Java EE app on WebLogic running on the host, or by another Docker container that writes it to a bind-mounted filesystem, so the "monitoring" container can access it (once mounted).

Now I need to deploy this image to monitor 2-3 microservices running in OpenShift. These microservices log to stdout, and I do not have the privileges to mount a root-owned path like /var/lib/docker/containers/[container-id]/[container-id]-json.log into the "monitoring" container.

What is the best way to parse the microservices' logs? I am allowed to change their Logback config, so they could, for example, log to stdout AND to an emptyDir volume, and I could create a Pod with the microservice + volume + monitoring container.
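That sidecar layout could look roughly like this (a minimal sketch, not a tested manifest; the image names are hypothetical, and the only real requirement is that both containers mount the same emptyDir, here at /logs):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: my-microservice              # hypothetical name
spec:
  volumes:
    - name: app-logs
      emptyDir: {}                   # shared scratch volume, lives as long as the Pod
  containers:
    - name: microservice
      image: my-registry/my-microservice:latest   # hypothetical image
      volumeMounts:
        - name: app-logs
          mountPath: /logs           # Logback file appender writes here
    - name: log-monitor
      image: my-registry/log-monitor:latest       # the "monitoring" image
      volumeMounts:
        - name: app-logs
          mountPath: /logs
          readOnly: true             # the monitor only tails the file
```

On the Logback side, the microservice would keep its console appender for stdout and add a file appender pointing into the shared mount, roughly:

```xml
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <!-- write into the shared emptyDir mount so the sidecar can tail it -->
  <file>/logs/app.log</file>
  <encoder>
    <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
  </encoder>
</appender>
<!-- referenced from the root logger alongside the existing console appender, e.g.
     <root level="INFO"><appender-ref ref="CONSOLE"/><appender-ref ref="FILE"/></root> -->
```

One thing to watch with this approach: a plain FileAppender will grow the emptyDir without bound, so a RollingFileAppender (or a sizeLimit on the emptyDir) is probably worth considering.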

Thank you everyone for your suggestions.

-- Roberto
docker
kubernetes
openshift

2 Answers

2/26/2020

How can I forward the logs to a single file on my filesystem?

-- pablo
Source: StackOverflow

2/14/2020

There's fluentd as a log aggregator for Docker. However, you cannot install only fluentd with OpenShift's Ansible installer (see deploying-efk). You can install the fluentd logging driver on the Docker host manually (see fluentd-logging-driver); there's a 'getting started' guide about the Docker host configuration. Then you can use any output plugin to store the logs.
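As a rough illustration of the manual route described here (assuming you can edit the Docker host's /etc/docker/daemon.json and that a fluentd instance is listening on the default forward port 24224; the paths are hypothetical), the Docker daemon can be pointed at fluentd like this:

```json
{
  "log-driver": "fluentd",
  "log-opts": {
    "fluentd-address": "localhost:24224"
  }
}
```

and fluentd can then route whatever it receives to any output plugin, for example the file output:

```
<source>
  @type forward
  port 24224
</source>

<match **>
  @type file
  # hypothetical destination for the aggregated container logs
  path /var/log/aggregated/containers
</match>
```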

-- Oligzeev
Source: StackOverflow