Converting filebeat message with logstash


I have set up Filebeat to parse my API logs and send messages to a centralized Logstash cluster.

Filebeat collects all the logs with minimal CPU consumption.
Logstash can transform the data, run multiple pipelines on dedicated servers, and send the output to Elasticsearch.

When parsing the log, Filebeat:

stores each line of my log in a key called "messages", where the value is a complex JSON object serialized as a string.
Example:

  "messages": "{\"key1\":\"value1\",\"key2\":\"value2\",\"key3\":\"value3\",\"key4\":\"value4\",\"key5\":\"value5\"}"

sends the Filebeat event to Logstash, which forwards it to Elasticsearch.

When Elasticsearch stores the document, it contains the key:

  "messages": "{\"key1\":\"value1\",\"key2\":\"value2\",\"key3\":\"value3\",\"key4\":\"value4\",\"key5\":\"value5\"}"

I would like to have each key/value pair as a top-level field in the Elasticsearch document, alongside the standard fields:

  "@timestamp": "XXXX-XX-XXTHH:MM:SS",
  "key1": "value1",
  "key2": "value2",
  ...

How can I achieve that? I don't want to install Logstash on each API server just to parse the logs (which are generated with Logback).



What does your Logstash configuration look like?

You need to parse the message in logstash using a json filter.
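Not the original poster's actual pipeline, but a minimal sketch of what such a configuration could look like, assuming the serialized JSON arrives in the `messages` field as shown above and that Logstash listens on the default Beats port:

```conf
input {
  beats {
    port => 5044   # default Beats input port; adjust to your setup
  }
}

filter {
  json {
    # Parse the JSON string stored in the "messages" field.
    # With no "target" option set, the parsed keys (key1..key5)
    # become top-level fields on the event.
    source => "messages"
    # Optionally drop the raw string once it has been parsed:
    remove_field => ["messages"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # assumption: local cluster
  }
}
```

Alternatively, Filebeat's own `decode_json_fields` processor can expand the JSON before the event ever reaches Logstash, if you would rather avoid the filter stage entirely.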

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.