Kubernetes JSON log parsing with Filebeat

Hello everyone
We have a project to ship application logs from a Kubernetes cluster to Elasticsearch. The problem we have right now is that the logs Kubernetes produces are JSON based and located at /var/log/container/containername.log. The log entries span multiple lines by default, which can be handled by the multiline settings in Filebeat. But the question is: how can I decode the JSON? As far as I have read, it is possible with:
json.keys_under_root: true
json.add_error_key: true
json.message_key: log
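
Putting those options together, a minimal input sketch might look like the following. This assumes a `log`-type input; the container log path glob and the timestamp regex are assumptions based on the sample logs below, so adjust them to your cluster. When `json.message_key` is set, Filebeat decodes the JSON first and then applies the multiline settings to that key:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/containers/*.log   # assumption: adjust to your actual log path
    json.keys_under_root: true
    json.add_error_key: true
    json.message_key: log
    # A new entry starts with a timestamp like "2019-11-04 17:46:48.269";
    # lines NOT matching this pattern are appended to the previous entry.
    multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}'
    multiline.negate: true
    multiline.match: after
```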

Here are some sample JSON logs. (Note: the timestamp marks where a log entry starts; this can be used as the multiline pattern.)

{"log":"2019-11-04 17:46:48.269 +0000 - [INFO] - from application in application-akka.actor.default-dispatcher-4 \n","stream":"stdout","time":"2019-11-04T17:46:48.2700177Z"}
{"log":"message pushed; channels: 1 , subscribers:0 ,message: T,17:46:48\n","stream":"stdout","time":"2019-11-04T17:46:48.270053426Z"}
{"log":"\n","stream":"stdout","time":"2019-11-04T17:46:48.270061916Z"}
{"log":"2019-11-04 17:46:49.280 +0000 - [INFO] - from application in application-akka.actor.default-dispatcher-4 \n","stream":"stdout","time":"2019-11-04T17:46:49.280515039Z"}
{"log":"message pushed; channels: 1 , subscribers:0 ,message: T,17:46:49\n","stream":"stdout","time":"2019-11-04T17:46:49.280552967Z"}
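
To make the intended behavior concrete, here is a small Python sketch of the same logic: decode each JSON line, then merge any line whose `log` field does not start with a timestamp into the preceding entry. The function name and the timestamp regex are illustrative assumptions, not part of Filebeat itself:

```python
import json
import re

# Assumption: a new log entry always begins with "YYYY-MM-DD HH:MM:SS".
NEW_ENTRY = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def group_multiline(raw_lines):
    """Decode each JSON line and merge continuation lines into the
    preceding entry, mimicking json.message_key + multiline settings."""
    entries = []
    for raw in raw_lines:
        doc = json.loads(raw)
        msg = doc["log"]
        if NEW_ENTRY.match(msg) or not entries:
            entries.append(doc)          # timestamp found: start a new entry
        else:
            entries[-1]["log"] += msg    # continuation: append to previous entry
    return entries

sample = [
    '{"log":"2019-11-04 17:46:48.269 +0000 - [INFO] - from application\\n","stream":"stdout","time":"2019-11-04T17:46:48.2700177Z"}',
    '{"log":"message pushed; channels: 1 , subscribers:0\\n","stream":"stdout","time":"2019-11-04T17:46:48.270053426Z"}',
    '{"log":"\\n","stream":"stdout","time":"2019-11-04T17:46:48.270061916Z"}',
]
grouped = group_multiline(sample)
print(len(grouped))  # → 1: the three JSON lines collapse into one entry
```

Running this on the first three sample lines above produces a single entry whose `log` field contains all three messages joined together.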
