Event generated after timeout is not pushed to Elasticsearch


(Chandrashekar V) #1

Hi All,

I have the following Logstash configuration:

input {
  file {
    type => "fail"
    path => "/filepath/ag.txt"
  }
}

filter {
  grok {
    match => [ "message", "%{LOGLEVEL:loglevel} - %{NOTSPACE:user_id} - %{GREEDYDATA:msg_text}" ]
  }

  aggregate {
    task_id => "%{user_id}"
    code => "
      map['clicks'] ||= 0; map['clicks'] += 1;
      map['several_clicks'] = false;
    "
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "user_id"
    timeout => 30
    timeout_tags => ['_aggregatetimeout']
    timeout_code => "event.set('several_clicks', event.get('clicks') > 1);"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "%{type}_agindex"
    action => "update"
    document_id => "%{user_id}"
    document_type => "doc"
    doc_as_upsert => "true"
  }
  stdout { codec => rubydebug }
}
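For reference, the aggregation the filter above performs can be sketched in plain Ruby. This is only a sketch of the intended behavior (one map per `task_id`, `clicks` incremented per event, the map flushed as a new event on timeout), not the aggregate plugin's actual internals; the `events` array and helper names here are illustrative assumptions:

```ruby
# One map per task_id (user_id); sketch of the plugin's intended behavior.
maps = Hash.new { |h, k| h[k] = { 'clicks' => 0, 'several_clicks' => false } }

# Stand-in for the six parsed input events (illustrative data).
events = [
  { 'user_id' => '12345' }, { 'user_id' => '12345' }, { 'user_id' => '12345' },
  { 'user_id' => '12346' }, { 'user_id' => '12346' }, { 'user_id' => '12346' }
]

# Corresponds to the `code` block: runs for every incoming event.
events.each do |event|
  maps[event['user_id']]['clicks'] += 1
end

# Corresponds to push_map_as_event_on_timeout + timeout_code +
# timeout_task_id_field + timeout_tags: runs once per expired map.
timeout_events = maps.map do |user_id, map|
  map['several_clicks'] = map['clicks'] > 1
  map.merge('user_id' => user_id, 'tags' => ['_aggregatetimeout'])
end
```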


The following is the input file:

INFO - 12345 - Clicked One
INFO - 12345 - Clicked Two
INFO - 12345 - Clicked Three
INFO - 12346 - Clicked One
INFO - 12346 - Clicked Two
INFO - 12346 - Clicked Three
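To sanity-check that these lines match the grok pattern above, here is a rough Ruby-regex approximation of it (an assumption for illustration: `LOGLEVEL` is simplified to a bare word, `NOTSPACE` to `\S+`, and `GREEDYDATA` to `.*`, which is looser than the real grok patterns):

```ruby
# Rough regex equivalent of:
#   %{LOGLEVEL:loglevel} - %{NOTSPACE:user_id} - %{GREEDYDATA:msg_text}
PATTERN = /^(?<loglevel>[A-Za-z]+) - (?<user_id>\S+) - (?<msg_text>.*)$/

lines = [
  'INFO - 12345 - Clicked One',
  'INFO - 12346 - Clicked Two'
]

lines.each do |line|
  m = PATTERN.match(line)
  puts "loglevel=#{m[:loglevel]} user_id=#{m[:user_id]} msg_text=#{m[:msg_text]}"
end
```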

All the events are generated appropriately. For the above input, two events are generated and two documents are created in Elasticsearch.

After the 30-second timeout, we get two more events, as shown below:

{
  "several_clicks" => true,
            "tags" => [
    [0] "_aggregatetimeout"
  ],
          "clicks" => 3,
        "@version" => "1",
      "@timestamp" => 2018-05-16T10:25:33.024Z,
         "user_id" => "12345"
}
{
  "several_clicks" => true,
            "tags" => [
    [0] "_aggregatetimeout"
  ],
          "clicks" => 3,
        "@version" => "1",
      "@timestamp" => 2018-05-16T10:25:33.025Z,
         "user_id" => "12346"
}

These events are not pushed to Elasticsearch.

This is the same example as the one provided here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-aggregate.html#plugins-filters-aggregate-example3

Please let me know what is missing in my configuration.

Thanks

V.Chandrashekar


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.