Aggregate filter plugin not working

I copied the exact same example from the documentation page:
https://www.elastic.co/guide/en/logstash/current/plugins-filters-aggregate.html#plugins-filters-aggregate-example1

I set up the filter in Logstash accordingly, but it is not working. Any idea what to do?

What do you mean by "it is not working"? What is the configuration and what is the result?

logs:

INFO - 12345 - TASK_START - start
INFO - 12345 - SQL - sqlQuery1 - 12
INFO - 12345 - SQL - sqlQuery2 - 34
INFO - 12345 - TASK_END - end

logstash config:

input {
  beats {
    port => "5044"
  }
}

filter {
  grok {
    match => [ "message", "%{LOGLEVEL:loglevel} - %{NOTSPACE:user_id} - %{GREEDYDATA:msg_text}" ]
    add_tag => ["grok done"]
  }

  aggregate {
    task_id => "%{user_id}"
    code => "map['clicks'] ||= 0; map['clicks'] += 1;"
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "user_id"
    timeout => 600 # 10 minutes timeout
    timeout_tags => ['_aggregatetimeout']
    timeout_code => "event.set('several_clicks', event.get('clicks') > 1)"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "merge004"
  }
}
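As a quick sanity check outside Logstash, the grok pattern in the filter above roughly corresponds to the following regular expression (a sketch: LOGLEVEL is approximated as a run of letters, NOTSPACE as non-whitespace, GREEDYDATA as the rest of the line):

```python
import re

# Rough regex equivalent of the grok pattern:
# "%{LOGLEVEL:loglevel} - %{NOTSPACE:user_id} - %{GREEDYDATA:msg_text}"
pattern = re.compile(
    r"(?P<loglevel>[A-Za-z]+) - (?P<user_id>\S+) - (?P<msg_text>.*)"
)

line = "INFO - 12345 - SQL - sqlQuery1 - 12"
m = pattern.match(line)
print(m.group("loglevel"))  # INFO
print(m.group("user_id"))   # 12345
print(m.group("msg_text"))  # SQL - sqlQuery1 - 12
```

If the sample log lines match here, a grok failure is unlikely to be the problem, and the `grok done` tag should appear on the events.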

output in Kibana:


The events in Kibana have a taskid field, but the configuration shown would produce user_id, so that output cannot come from this configuration. Also, your data has both start and end markers (TASK_START / TASK_END), so why use the timeout-based example, which is intended for data that has no end marker?
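Since the data has explicit start and end markers, a filter modeled on the documentation's start/end example fits better than the timeout-based one. A sketch, assuming the grok fields from the config above (the sql_count field name is just an illustration):

filter {
  grok {
    match => [ "message", "%{LOGLEVEL:loglevel} - %{NOTSPACE:user_id} - %{GREEDYDATA:msg_text}" ]
  }

  if [msg_text] =~ "TASK_START" {
    aggregate {
      task_id => "%{user_id}"
      code => "map['sql_count'] = 0"
      map_action => "create"
    }
  }

  if [msg_text] =~ "SQL" {
    aggregate {
      task_id => "%{user_id}"
      code => "map['sql_count'] += 1"
      map_action => "update"
    }
  }

  if [msg_text] =~ "TASK_END" {
    aggregate {
      task_id => "%{user_id}"
      code => "event.set('sql_count', map['sql_count'])"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}

Here end_of_task => true closes the map when TASK_END arrives, and the timeout only covers tasks whose end event never shows up.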

Also, are you using --pipeline.workers 1 or -w 1?
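This matters because the aggregate filter only works correctly with a single filter worker: events sharing a task_id must be processed in order by the same thread. Equivalent to the command-line flags, this can also be set in logstash.yml:

```yaml
# logstash.yml -- same effect as -w 1 / --pipeline.workers 1
pipeline.workers: 1
```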


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.