The whole idea is flawed. All log messages pertaining to a Postfix queue id might accumulate over several days. Attempting to merge them all into a single event using any kind of multiline filter is a mistake.
Hello, and thanks for your reply.
Yes, they might accumulate over several days, but 95%+ go through in about 2 seconds. For the worst cases I have set "max_age => 120".
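For readers following along: the `max_age` option mentioned here belongs to the old multiline filter. A minimal sketch of the kind of configuration being discussed, assuming an earlier grok stage has already extracted the Postfix queue id into a hypothetical `queue_id` field (the pattern and field name are illustrative, not the poster's actual config):

```
filter {
  multiline {
    # Group lines by Postfix queue id instead of the default
    # host/path/type stream identity.
    stream_identity => "%{queue_id}"
    # Merge every line into the previous event of the same stream.
    pattern => ".*"
    what    => "previous"
    # Flush an incomplete event after 120 seconds, per the comment above.
    max_age => 120
  }
}
```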
Hello Magnus, you say using the multiline option to tie events together is a mistake. Are there other alternatives, especially in a multi-threaded environment? Honestly, if I find no other alternatives, I'll look for other solutions, like parsing the JSON format before the data is indexed. Grok is a really limited option.
Using logstash-logback-encoder, for instance, allows encoding logged messages in JSON and then sending them to Logstash over a TCP socket. The JSON content would already be structured. And yes, I've been told the join option is not supported by Logstash. It is really a shame that this cannot be done via grok.
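For illustration, the receiving end of that setup could be as simple as a TCP input with a JSON codec; the port number here is just an example:

```
input {
  tcp {
    # logstash-logback-encoder's LogstashTcpSocketAppender would point here.
    port  => 5000
    # Each log event arrives as one newline-delimited JSON object.
    codec => json_lines
  }
}
```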
> Using logstash-logback-encoder, for instance, allows encoding logged messages in JSON and then sending them to Logstash over a TCP socket. The JSON content would already be structured.
Yes, that's preferred to parsing text.
> And yes, I've been told the join option is not supported by Logstash.
Actually, my filter workers are set to more than one, which makes it impossible to use either the multiline or the aggregate filter, and I can't lower the number to 1. And the reason I can't use logstash-logback-encoder for the moment is that I am using my server to archive my logs, so I need to receive the logs in their original format rather than structured as JSON.
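One possible way around the archiving constraint, sketched here as an untested suggestion rather than anything from the thread, is to keep the unparsed line in the `message` field and write it to a file output alongside the indexed copy; all paths and hosts below are placeholders:

```
output {
  # Archive the original, unstructured line exactly as it arrived.
  file {
    path  => "/var/log/archive/mail-%{+YYYY-MM-dd}.log"
    codec => line { format => "%{message}" }
  }
  # Index the parsed, structured copy for Kibana.
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```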
I really wish someone could help me and shed light on this, because I am really at a loss as to how to tie the related events together with filters or scripts, if that is even feasible, in order to display them in Kibana. I've been looking for a solution for months now. Sorry for using this page to get answers; I've been unable to get them elsewhere.
I'd keep the Logstash parsing simple and have it emit low-level events that correspond to the actual events. Then I'd feed those, probably via a broker, to a service (possibly Logstash again) that correlates events and emits high-level events with all information about e.g. an email transaction.
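A sketch of what that second, correlating stage might look like, assuming a Redis broker and the aggregate filter running with a single worker (`-w 1`) so events arrive in order; the key name and `queue_id` field are illustrative:

```
# Correlation pipeline: reads low-level events from the broker,
# emits one high-level event per Postfix queue id.
input {
  redis {
    host      => "localhost"
    data_type => "list"
    key       => "postfix-raw"
  }
}

filter {
  aggregate {
    # One correlation task per queue id.
    task_id => "%{queue_id}"
    # Accumulate each low-level message into the task's map.
    code => "map['events'] ||= []; map['events'] << event.get('message')"
    # When the transaction has been quiet for two minutes, emit the
    # accumulated map as a single high-level event.
    push_map_as_event_on_timeout => true
    timeout                      => 120
    timeout_task_id_field        => "queue_id"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```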