I have a K8s Deployment that runs an app container (an ASP.NET Core application) alongside a logging sidecar. I use Serilog's Console sink to log to stdout. I also have a ConfigMap that stores fluentd's configuration. A very nice article about this part is this.
What I'm trying to do is use the sidecar to forward the app's logs to Elasticsearch, using the respective output plugin. But what should I use in the source and filter tags to achieve that?
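For context on what I've considered: a `forward` source only receives records that something actively pushes to port 24224, so stdout logs don't reach it on their own. One commonly suggested alternative is a `tail` source over a log file on an emptyDir volume mounted into both containers. A minimal sketch, assuming the app also writes its Serilog JSON output to /var/log/app/app.log (the path and tag here are placeholders, not from my actual setup):

```
<source>
  @type tail
  path /var/log/app/app.log
  pos_file /var/log/app/app.log.pos
  tag app.log
  <parse>
    @type json
  </parse>
</source>
```

With a source like this the records would already arrive JSON-parsed, so the separate parser filter might be unnecessary, but I'm not sure this is the right approach for a sidecar.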
The fluentd's configuration is the following:
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<filter *.**>
  @type parser
  key_name log
  format json
  reserve_data true
</filter>
<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host 192.168.1.41
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    user elastic
    index_name "ap-*"
    password xxxxxxxxx
    type_name access_log
    tag_key @log_name
    flush_interval 1s
  </store>
  <store>
    @type stdout
  </store>
</match>
With this configuration, fluentd starts and runs, but no messages are forwarded to Elasticsearch.
Here are fluentd's logs:
2021-04-08 11:37:25 +0000 [info]: parsing config file is succeeded path="/fluentd/etc/fluent.conf"
2021-04-08 11:37:25 +0000 [info]: gem 'fluent-plugin-elasticsearch' version '5.0.2'
2021-04-08 11:37:25 +0000 [info]: gem 'fluentd' version '1.12.2'
2021-04-08 11:37:25 +0000 [info]: 'flush_interval' is configured at out side of <buffer>. 'flush_mode' is set to 'interval' to keep existing behaviour
2021-04-08 11:37:25 +0000 [info]: using configuration file: <ROOT>
<source>
  @type forward
  port 24224
  bind "0.0.0.0"
</source>
<filter *.**>
  @type parser
  key_name "log"
  format json
  reserve_data true
  <parse>
    @type json
  </parse>
</filter>
<match *.**>
  @type copy
  <store>
    @type "elasticsearch"
    host "192.168.1.41"
    port 9200
    logstash_format true
    logstash_prefix "fluentd"
    logstash_dateformat "%Y%m%d"
    include_tag_key true
    user "elastic"
    index_name "catalogapi-*"
    password xxxxxx
    type_name "access_log"
    tag_key "@log_name"
    flush_interval 1s
    <buffer>
      flush_interval 1s
    </buffer>
  </store>
  <store>
    @type "stdout"
  </store>
</match>
</ROOT>
2021-04-08 11:37:25 +0000 [info]: starting fluentd-1.12.2 pid=7 ruby="2.6.6"
2021-04-08 11:37:25 +0000 [info]: spawn command to main: cmdline=["/usr/local/bin/ruby", "-Eascii-8bit:ascii-8bit", "/usr/local/bundle/bin/fluentd", "-c", "/fluentd/etc/fluent.conf", "-p", "/fluentd/plugins", "--under-supervisor"]
2021-04-08 11:37:26 +0000 [info]: adding filter pattern="*.**" type="parser"
2021-04-08 11:37:26 +0000 [info]: adding match pattern="*.**" type="copy"
2021-04-08 11:37:26 +0000 [info]: #0 'flush_interval' is configured at out side of <buffer>. 'flush_mode' is set to 'interval' to keep existing behaviour
2021-04-08 11:37:26 +0000 [warn]: #0 Detected ES 7.x: `_doc` will be used as the document `_type`.
2021-04-08 11:37:26 +0000 [info]: adding source type="forward"
2021-04-08 11:37:26 +0000 [info]: #0 starting fluentd worker pid=16 ppid=7 worker=0
2021-04-08 11:37:26 +0000 [info]: #0 listening port port=24224 bind="0.0.0.0"
I've made numerous attempts with various configurations, but nothing has worked.
Thanks in advance for your time.