I’m trying to add centralized logging to my Kubernetes cluster. I installed the EFK stack following an example. Everything is working fine, but the logs from my application arrive as plain strings, and I want them written to Elasticsearch as JSON.
I found a solution using a filter in fluentd, something like this:
<filter kubernetes.**>
  @type parser
  key_name log
  hash_value_field parsed
  <parse>
    @type json
  </parse>
</filter>
My problem now is that not all pods log in JSON format, and the non-JSON lines cause parse exceptions in fluentd.
How can I parse only the logs that are valid JSON and pass the other logs through unchanged?
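For reference, here is a sketch of what I have in mind, assuming the parser filter's `reserve_data` and `emit_invalid_record_to_error` options behave as documented (keep the original record fields, and don't route unparseable records to the error stream):

```
<filter kubernetes.**>
  @type parser
  key_name log
  hash_value_field parsed
  # keep the original fields alongside the parsed result
  reserve_data true
  # when the line isn't JSON, pass the record through instead of raising
  emit_invalid_record_to_error false
  <parse>
    @type json
  </parse>
</filter>
```

I'm not sure whether this is the idiomatic way to handle mixed JSON/plain-text pods, or if there is a better pattern.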
Thanks