Hi,
my pipeline looks like this: log -> filebeat -> redis -> logstash -> elasticsearch
My logfile is plain JSON that should be ready to index into Elasticsearch directly, so Logstash shouldn't really need to do much with it. A log event may contain a field named message, but there are also log events where the message field is absent (e.g. if the log entry is some kind of metrics).
Now I noticed that Filebeat sends the whole log line in the field message.
So in Logstash I need to parse the message field as JSON. If the encapsulated JSON contains a field named message, the outer message field is overwritten with the inner one. But if the original log event has no message field, the outer message stays as-is (i.e. the raw JSON string from Filebeat) — and in that case I do not want to keep this "Filebeat message".
What options do I have?
Is there a way to tell filebeat for a specific logfile to store the log line in field xy instead of message?
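As far as I can tell, Filebeat always puts the raw line into message, but it can decode JSON lines itself, which might sidestep the problem entirely. A sketch for a log input (the path is just a placeholder; whether the raw line is dropped when decoding succeeds is something I'd want to verify):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/app.json   # hypothetical path, adjust to your setup
    # Decode each line as JSON in Filebeat itself:
    json.keys_under_root: true    # place decoded keys at the event root
    json.overwrite_keys: true     # let decoded fields (incl. message) win over Filebeat's own
    json.add_error_key: true      # add error.message/error.type on decode failure
```

With this, Logstash would receive already-structured events and would not need a json filter at all.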
A workaround that comes to mind: rename the field message to myMessage, parse myMessage as JSON, and delete myMessage if no JSON parse error (_jsonparsefailure) is in tags.
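That workaround could look roughly like this in a Logstash filter (myMessage is just my made-up field name; the json filter tags events with _jsonparsefailure on failure by default):

```
filter {
  # Move the raw Filebeat line out of the way
  mutate {
    rename => { "message" => "myMessage" }
  }
  # Parse it; decoded keys land at the event root,
  # so an inner "message" field becomes the new message
  json {
    source => "myMessage"
  }
  # Keep the raw copy only if parsing failed
  if "_jsonparsefailure" not in [tags] {
    mutate {
      remove_field => [ "myMessage" ]
    }
  }
}
```

But this rename/parse/remove round-trip copies the (possibly very long) line once more per event, which is exactly what I'd like to avoid.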
So what can you recommend? I want this to be as performant as possible. The log lines can be really long (stack traces and lots of other information), so I am looking for a solution that is as lean as possible in terms of performance and hardware resources.
Thanks, Andreas