There are a few ways to collect Docker or Kubernetes container logs.
Docker's json-file log driver does not support multiline events, so the logs under /var/lib/docker/containers/ are written one line per record; the log collector therefore has to reassemble multiline events (such as Java stack traces) itself.
After a few days of searching, I think Fluentd is the best tool for this, and I found two ways to approach the problem, but neither of them works for me yet.
The origin log:
{"log":"2018-04-19 14:19:57,915 INFO [FixedTimeScheduler] com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}\n","stream":"stdou
t","time":"2018-04-19T06:19:57.916259717Z"}
{"log":"2018-04-19 14:19:57,915 INFO [FixedTimeScheduler] com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}\n","stream
":"stdout","time":"2018-04-19T06:19:57.916265977Z"}
{"log":"2018-04-19 14:20:43,446 ERROR [FixedTimeScheduler] com.testjavatest.fastdemo.task.JobTask: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed o
ut (Connection timed out)\n","stream":"stdout","time":"2018-04-19T06:20:43.448436321Z"}
{"log":"org.apache.http.conn.HttpHostConnectException: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed out (Connection timed out)\n","stream":"s
tdout","time":"2018-04-19T06:20:43.448475801Z"}
{"log":"\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)\n","stream":"stdout","time":"2018-04-19T06:20:43.4484860
6Z"}
{"log":"\u0009at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)\n","stream":"stdout","time":"2018-04-19T06:20:43.448492586
Z"}
{"log":"\u0009at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)\n","stream":"stdout","time":"2018-04-19T06:20:43.448498085Z"}
{"log":"\u0009at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)\n","stream":"stdout","time":"2018-04-19T06:20:43.448503302Z"}
{"log":"\u0009at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)\n","stream":"stdout","time":"2018-04-19T06:20:43.4485085Z"}
{"log":"\u0009at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)\n","stream":"stdout","time":"2018-04-19T06:20:43.448527373Z"}
{"log":"\u0009at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)\n","stream":"stdout","time":"2018-04-19T06:20:43.448532363Z"}
{"log":"\u0009at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)\n","stream":"stdout","time":"2018-04-19T06:20:43.44853704Z"}
{"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)\n","stream":"stdout","time":"2018-04-19T06:20:43.448541864Z"}
{"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)\n","stream":"stdout","time":"2018-04-19T06:20:43.448546452Z"}
{"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)\n","stream":"stdout","time":"2018-04-19T06:20:43.448551288Z"}
{"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.send(MyHttpClient.java:308)\n","stream":"stdout","time":"2018-04-19T06:20:43.448555115Z"}
{"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.send(MyHttpClient.java:264)\n","stream":"stdout","time":"2018-04-19T06:20:43.448559028Z"}
{"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvBytes(MyHttpClient.java:470)\n","stream":"stdout","time":"2018-04-19T06:20:43.448563048Z"}
{"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvString(MyHttpClient.java:474)\n","stream":"stdout","time":"2018-04-19T06:20:43.448567524Z"}
{"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvJSON(MyHttpClient.java:483)\n","stream":"stdout","time":"2018-04-19T06:20:43.448571872Z"}
{"log":"\u0009at com.testjavatest.fastdemo.task.JobTask.run(JobTask.java:70)\n","stream":"stdout","time":"2018-04-19T06:20:43.448576282Z"}
{"log":"\u0009at com.testjavatest.fastdemo.task.timer.FixedTimeScheduler.run(FixedTimeScheduler.java:43)\n","stream":"stdout","time":"2018-04-19T06:20:43.44858108Z"}
{"log":"Caused by: java.net.ConnectException: Connection timed out (Connection timed out)\n","stream":"stdout","time":"2018-04-19T06:20:43.448585665Z"}
{"log":"\u0009at java.net.PlainSocketImpl.socketConnect(Native Method)\n","stream":"stdout","time":"2018-04-19T06:20:43.448590045Z"}
{"log":"\u0009at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)\n","stream":"stdout","time":"2018-04-19T06:20:43.448596151Z"}
{"log":"\u0009at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)\n","stream":"stdout","time":"2018-04-19T06:20:43.448600888Z"}
{"log":"\u0009at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)\n","stream":"stdout","time":"2018-04-19T06:20:43.448605225Z"}
{"log":"\u0009at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)\n","stream":"stdout","time":"2018-04-19T06:20:43.448609704Z"}
{"log":"\u0009at java.net.Socket.connect(Socket.java:589)\n","stream":"stdout","time":"2018-04-19T06:20:43.448614111Z"}
{"log":"\u0009at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:337)\n","stream":"stdout","time":"2018-04-19T06:20:43.448618537Z"}
{"log":"\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)\n","stream":"stdout","time":"2018-04-19T06:20:43.4486230
34Z"}
{"log":"\u0009... 17 more\n","stream":"stdout","time":"2018-04-19T06:20:43.448627326Z"}
{"log":"2018-04-19 14:20:57,934 INFO [FixedTimeScheduler] com.testjavatest.fastdemo.config.WsJobTask: ws heartbeat run\n","stream":"stdout","time":"2018-04-19T06:20:57.934646438Z"}
The expected output in GELF: each Java log event should arrive as a single message. For example, the ERROR line above together with its whole stack trace should become one GELF message, not dozens of separate one-line messages.
My first Fluentd config is:
<source>
@id fluentd-containers.log
@type tail
path "/var/lib/docker/containers/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b-json.log"
pos_file "/var/log/fluentd-containers.log.pos"
time_format %Y-%m-%dT%H:%M:%S.%NZ
tag "raw.docker.*"
format json
read_from_head true
<parse>
time_format %Y-%m-%dT%H:%M:%S.%NZ
@type json
time_type string
</parse>
</source>
<match raw.docker.**>
@id raw.docker
@type detect_exceptions
remove_tag_prefix "raw"
message "log"
stream "stream"
languages java, python
multiline_flush_interval 5
max_bytes 0
max_lines 0
</match>
<match docker.**>
@type copy
<store>
@type "gelf"
protocol "udp"
host "192.168.2.4"
port 12206
flush_interval 5s
<buffer>
flush_mode interval
retry_type exponential_backoff
flush_interval 5s
</buffer>
</store>
<store>
@type "stdout"
</store>
</match>
But this does not work. Could someone give me a hand?
My second Fluentd config is:
<source>
@id fluentd-containers.log
@type tail
from_encoding UTF-8
encoding UTF-8
path /var/lib/docker/containers/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b-json.log
pos_file /var/log/fluentd-containers.log.pos
time_format %Y-%m-%dT%H:%M:%S.%NZ
tag raw.docker.*
format json
read_from_head true
</source>
<filter raw.docker.**>
@type grep
regexp1 stream stdout
</filter>
<filter raw.docker.**>
@type concat
key log
multiline_start_regexp /^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d+/
continuous_line_regexp /^\s+/
separator ""
flush_interval 3s
</filter>
<filter raw.docker.**>
@type parser
key_name log
inject_key_prefix log.
<parse>
@type multiline_grok
grok_failure_key grokfailure
<grok>
pattern /%{TIMESTAMP_ISO8601:log_time}%{SPACE}%{LOGLEVEL:log_level}%{SPACE}\[%{DATA:threadname}\]%{SPACE}%{DATA:classname}%{SPACE}:%{SPACE}%{GREEDYDATA:log_message}/
</grok>
</parse>
</filter>
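For reference, the grok pattern in the parser filter above corresponds roughly to the following Python regex (a simplified approximation for illustration only; the real TIMESTAMP_ISO8601 and LOGLEVEL grok patterns are more permissive than these sketched groups):

```python
import re

# Rough Python equivalent of the grok pattern above (simplified sketch).
LOG_RE = re.compile(
    r'^(?P<log_time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d+)\s+'
    r'(?P<log_level>[A-Z]+)\s+'
    r'\[(?P<threadname>[^\]]+)\]\s+'
    r'(?P<classname>\S+)\s*:\s*'
    r'(?P<log_message>.*)', re.DOTALL)

line = ("2018-04-19 14:19:57,915 INFO [FixedTimeScheduler] "
        "com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message")
m = LOG_RE.match(line)
print(m.group("log_level"), m.group("log_message"))  # prints: INFO send ws message
```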
<match raw.docker.**>
@type copy
<store>
@type gelf
protocol udp
host 10.111.2.4
port 12204
flush_interval 5s
</store>
<store>
@type stdout
</store>
<store>
@type file
path /var/log/test
</store>
</match>
But this does not work either. Could someone give me a hand? Thanks.
PS: Is there a better way to collect Docker logs?
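To make the intended behavior concrete, the grouping that the concat filter should perform can be sketched in Python (a simplified illustration, not the plugin's actual code): a docker json-file line whose `log` field matches the start regexp begins a new event, and every other line is appended to the current one.

```python
import json
import re

# Same start pattern as multiline_start_regexp in the concat filter.
START = re.compile(r'^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d+')

def group_events(json_lines):
    """Merge docker json-file records into multiline events (simplified)."""
    events = []
    for line in json_lines:
        msg = json.loads(line)["log"]
        if START.match(msg) or not events:
            events.append(msg)       # a timestamped line starts a new event
        else:
            events[-1] += msg        # continuation line (e.g. stack frame)
    return events

lines = [
    '{"log":"2018-04-19 14:20:43,446 ERROR oops\\n","stream":"stdout","time":"t1"}',
    '{"log":"java.net.ConnectException: timed out\\n","stream":"stdout","time":"t2"}',
    '{"log":"\\tat java.net.Socket.connect(Socket.java:589)\\n","stream":"stdout","time":"t3"}',
    '{"log":"2018-04-19 14:20:57,934 INFO ok\\n","stream":"stdout","time":"t4"}',
]
events = group_events(lines)
print(len(events))  # prints: 2
```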