Combining two events into one


(Vasiliy) #1

Hello over there!

I'm looking for a way to combine two logs into one event. Say I have log lines that look like:

[timestamp] [unique id] [some message]
... // log messages of different formats/types go here
[timestamp] [unique id] [a message]

I need to produce one event instead of two by combining the first and last lines when their unique ids match. The other log lines in between have to be processed in their own way, which I can handle with an if statement. I wonder whether aggregate + drop filters are what I actually need here.
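
For example (hypothetical values), given lines like:

  2017-01-10 12:00:01 [abc123] GET /api/orders        <- first line (request)
  2017-01-10 12:00:01 [def456] some unrelated message
  2017-01-10 12:00:02 [abc123] 200 OK                 <- last line (response), same unique id

I'd like to end up with a single event for id abc123 that carries both the request and the response content, while the unrelated line in between keeps being processed on its own.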

Thanks in advance,
Vasiliy


(Mark Walkom) #2

Probably the aggregate filter.


(Vasiliy) #3

Aggregate + drop filters resolved my issue.


(Mark Walkom) #4

What does the config look like? It might be helpful for someone else in the future :slight_smile:


(Vasiliy) #5

Consider "requestMethod" and "responseStatus" is something coming from "message". "message" is a rest part of log event after timestamp and unique id like [timestamp] [unique id] [message].

  if [uniqueId] =~ /.+/ {
    if [requestMethod] =~ /.+/ {
      # first event of the pair: store its message in the aggregate map, keyed by the unique id
      aggregate {
        task_id => "%{uniqueId}"
        code => "map['requestContent'] = event['message']"
        map_action => "create"
        timeout => 120
      }
      drop {}
    } else if [responseStatus] =~ /.+/ {
      # second event of the pair: attach the stored request content and close the task
      aggregate {
        task_id => "%{uniqueId}"
        code => "event['requestContent'] = map['requestContent']"
        map_action => "update"
        end_of_task => true
      }
    }
  }
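
Note: this uses the old Ruby event API (event['field']); newer Logstash releases expect the event.get / event.set API instead. A minimal sketch of the same filter using that API, assuming the same field names, might look like:

  if [uniqueId] =~ /.+/ {
    if [requestMethod] =~ /.+/ {
      aggregate {
        task_id => "%{uniqueId}"
        # same logic as above, but reading the field via event.get
        code => "map['requestContent'] = event.get('message')"
        map_action => "create"
        timeout => 120
      }
      drop {}
    } else if [responseStatus] =~ /.+/ {
      aggregate {
        task_id => "%{uniqueId}"
        # same logic as above, but writing the field via event.set
        code => "event.set('requestContent', map['requestContent'])"
        map_action => "update"
        end_of_task => true
      }
    }
  }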

(Nikolay Shushkin) #6

There is a limitation in aggregate usage: "You should be very careful to set Logstash filter workers to 1 (the -w 1 flag) for this filter to work correctly, otherwise events may be processed out of sequence and unexpected results will occur."

But reducing the filter workers to 1 impacts performance.
However, there is an additional detail known about the input files: all unique IDs needed for aggregation occur within the same file.
Is it possible to set up Logstash processing somehow with multiple workers (e.g. equal to the number of input files) so that every worker processes only one file?

If not, then any advice is appreciated.
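
To make the question concrete, the kind of setup I have in mind is one single-worker pipeline per input file, e.g. a hypothetical pipelines.yml like this (assuming a Logstash version with multiple-pipeline support; ids and paths are made up):

  # pipelines.yml: one pipeline per input file, each restricted to a single worker
  - pipeline.id: service-a
    path.config: "/etc/logstash/conf.d/service-a.conf"   # its file input reads only service-a.log
    pipeline.workers: 1
  - pipeline.id: service-b
    path.config: "/etc/logstash/conf.d/service-b.conf"   # its file input reads only service-b.log
    pipeline.workers: 1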


(Rodrigo Araujo Cavalcante) #7

Hi! Can you help me please? I'm a newbie with LS and FB.
I'm using Filebeat to send a log.txt to Logstash, and I want to save some field values. For example:
FB sends two events. Event one contains a field value XPO, which I save as "word1";
event two sends two values, "NG, A000", from which I capture "word2=NG" and "word3=A000". But I want to put the 3 captured pieces of information together into a single event. How can I do that? Could you show me an example please?


(Mark Walkom) #8

It would be better if you started your own thread for this request :slight_smile:


(Vasiliy) #9

The aggregate plugin would help you if you have something in common across all 3 log messages. See my example above in the original topic. I have a unique id spread across a few log messages; using it, I can combine several logs into one.
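
For example, a minimal sketch (assuming the events you want to merge share some common correlation field, here a hypothetical [sessionId], and that word1/word2/word3 are already extracted into fields by your other filters) could look like this:

  filter {
    if [sessionId] =~ /.+/ {
      aggregate {
        task_id => "%{sessionId}"
        # each event contributes whichever of the three fields it carries to the shared map
        code => "
          map['word1'] ||= event.get('word1')
          map['word2'] ||= event.get('word2')
          map['word3'] ||= event.get('word3')
        "
        # when no more events arrive for this id, the map is pushed as one combined event
        push_map_as_event_on_timeout => true
        timeout_task_id_field => "sessionId"
        timeout => 30
      }
    }
  }

The original events still pass through the pipeline as usual; add a drop {} after the aggregate if you only want the combined event.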

