Hi,
I have set up Filebeat to collect my API logs and ship the events to a centralized Logstash cluster.
Filebeat collects all the logs with minimal CPU consumption, and Logstash can transform the data, run multiple pipelines on dedicated servers, and send the output to Elasticsearch.
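For reference, the relevant part of my Filebeat configuration is just a log input pointing at the API log files plus a Logstash output, roughly like this (paths and hostnames are placeholders):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/my-api/*.log

output.logstash:
  hosts: ["logstash01:5044", "logstash02:5044"]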
When reading the log, Filebeat stores each line of my log under a key called "messages", and the value is a complex JSON object.
Example:
"messages" : "{"key1":"value1","key2":"value2","key3":"value3","key4":"value4","key5":"value5"}"
Filebeat then sends the event to Logstash, which sends it on to Elasticsearch.
When Elasticsearch stores the document, it contains the key:
"messages": "{\"key1\":\"value1\",\"key2\":\"value2\",\"key3\":\"value3\",\"key4\":\"value4\",\"key5\":\"value5\"}"
Instead, I would like to have each key/value pair as a top-level field in the Elasticsearch document:
{
"@timestamp": "XXXX-XX-XXTHH:MM:SS",
"key1":"value1",
"key2":"value2",
"key3":"value3",
"key4":"value4",
"key5":"value5"
}
How can I achieve that? I don't want to install Logstash on each API server to parse the logs (which are produced by Logback).
Regards,
Farid