I would like to combine several consecutive events, and the aggregate filter looks like the right tool for this. Since I don't have a specific start or end event, I am following example #3 in the Logstash documentation.

When a new event is created from the map at timeout, it unfortunately contains only the fields I explicitly assigned earlier. How can I carry over all fields of the last aggregated event? I think I need something like `map['_source'] = event.get('[_source]')` in my code section, but that line doesn't work.
Background: my log contains ten consecutive entries with different parameters, and I would like to combine them into one event. A unique `task_id` is present. The events occur at irregular intervals, so using `push_previous_map_as_event` does not seem to make sense to me.
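For what it's worth, a common workaround is to copy every field of the current event into the map with `event.to_hash`, so the timeout event ends up with all fields of the last aggregated event rather than only hand-picked ones. A minimal sketch (the field name `task_id` and the timeout value are assumptions based on your description, not taken from your config):

```
filter {
  aggregate {
    task_id => "%{task_id}"
    code => "
      # Copy all fields of the current event into the map, skipping
      # Logstash metadata fields such as @timestamp and @version.
      event.to_hash.each { |k, v| map[k] = v unless k.start_with?('@') }
    "
    push_map_as_event_on_timeout => true
    timeout => 120
    timeout_task_id_field => "task_id"
  }
}
```

Note that `_source` is an Elasticsearch concept, not a Logstash field, which is why `event.get('[_source]')` returns nothing inside the filter.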