Aggregate - concatenate events

12:12:12 processed T1
data1 val1
data2 val2
dataId 1111
data7 val7

I have configured Filebeat to combine lines like the above into a single multiline event and send it to Logstash.

In Logstash, I am using grok to extract the dataId value and aggregate to combine the multiline events that share it.

aggregate {
    task_id => "%{idValue}"
    code => "event.set('message', event.get('message') + map['message'])"
}

I don't want individual events to appear in Elasticsearch - I tried event.cancel(), but it didn't seem to work.

I tried timeout and inactivity_timeout configurations.

Using the aggregate plugin to zip the messages back together is problematic because ordering is not strictly maintained; the multiline messages will need to be grouped on the Filebeat side before being transmitted to Logstash.
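For reference, grouping the lines of one event on the Filebeat side is done with the multiline.* settings on the input. A minimal sketch, assuming each event starts with a timestamp like `12:12:12` (the path is hypothetical):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log                    # hypothetical path
    multiline.type: pattern
    multiline.pattern: '^\d{2}:\d{2}:\d{2} '  # a line starting with a timestamp begins a new event
    multiline.negate: true
    multiline.match: after                    # non-matching lines are appended to the previous event
```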

If you are sending multiline events to Logstash, use the options described here to handle multiline events before sending the event data to Logstash. Trying to implement multiline event handling in Logstash (for example, by using the Logstash multiline codec) may result in the mixing of streams and corrupted data.

-- Filebeat Reference

How is Filebeat configured? Specifically, what are the values of the multiline.* settings?

I have already taken care of Filebeat and I'm receiving multiple lines as one event. I'm trying to combine several of those events based on an identifier. I'm able to group them, but instead of getting only the final aggregated event, I am getting 1, 1-2, 1-2-3 and so on.

How do you know when an event is "done" and ready to be emitted? There are several generalised examples in the aggregate filter plugin docs that cover the various ways of configuring the plugin depending on what you expect.
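If the data contains a recognisable last line, the docs' end-of-task pattern applies: accumulate into the map on intermediate events and emit only when the end marker arrives. A hedged sketch, assuming `idValue` is grokked out beforehand and that a line containing `data7` marks the end of a group (that condition is an assumption, not from the original config):

```
# intermediate lines: accumulate and suppress
if [message] !~ /data7/ {
  aggregate {
    task_id => "%{idValue}"
    code => '
      map["message"] ||= ""
      map["message"] += event.get("message")
      event.cancel()                          # suppress the partial event
    '
  }
} else {
  # hypothetical end marker: emit the combined message and close the task
  aggregate {
    task_id => "%{idValue}"
    code => '
      map["message"] += event.get("message")
      event.set("message", map["message"])    # final event carries the full text
    '
    end_of_task => true                       # discard the map for this task_id
  }
}
```

If no end marker exists, the alternative in the docs is push_map_as_event_on_timeout with a timeout, cancelling every incoming event so only the pushed map reaches the output.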

Do you have example pipeline configuration that you can share?

I have a conf file with input, filter and output.

In filter, I have grok and aggregate.

I am using timeout and inactivity_timeout to aggregate events.

How do I avoid duplicates and get only the aggregated events?

Do I need to use the drop plugin if I don't want individual events to be sent to Elasticsearch?
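For context, the setup described above can be sketched roughly like this (the grok pattern, port, hosts, and the 120-second timeout are assumptions, not taken from the actual conf file):

```
input {
  beats { port => 5044 }
}

filter {
  grok {
    # hypothetical pattern: pull the identifier out of the multiline message
    match => { "message" => "dataId %{NUMBER:idValue}" }
  }
  aggregate {
    task_id => "%{idValue}"
    code => '
      map["message"] ||= ""
      map["message"] += event.get("message")
      event.cancel()                        # drop the partial event
    '
    push_map_as_event_on_timeout => true    # emit the map as a new event when the timeout fires
    timeout => 120
    timeout_task_id_field => "idValue"      # copy the task id onto the emitted event
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

With event.cancel() inside the code block, the individual events never reach the output, so a separate drop filter should not be needed; only the map pushed on timeout is indexed.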

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.