Greppy
October 3, 2015, 4:49am
1
For some reason, I can't get the Elapsed time filter to work just right.
From the debug log with Logstash 1.5.4, it appears that my start and end events are getting picked up correctly, along with their timestamps, but the elapsed.time calculation is way off...
Any thoughts on where I should look?
For example:
Start Time=>"2015-08-28T07:49:23.334Z"
End Time=>"2015-08-28T07:49:57.904Z"
Elapsed.timestamp_start => "2015-08-28T07:49:23.334Z"
Elapsed.time => 3096668.469 (about 35.8 days, instead of the ~34.57 seconds I'd expect) ?????
Below is the debug log output (some lines removed to shorten the post):
"@timestamp " => "2015-08-28T07:49:23.334Z",
"time" => "2015-08-28 07:49:23.334",
"tags" => [
[0] "Time",
[1] "Received",
[2] "start_tag"
],
"start_time" => "2015-08-28 07:49:23.334",
"task_id" => "0eb29fc3-06a6-4f15-a628-c910b9ad08e2",
"debug" => "timestampMatched"
Elapsed, 'end event' received {:end_tag=>"end_tag", :unique_id_field=>"task_id", :level=>:info}
"@version " => "1",
"@timestamp " => "2015-08-28T07:49:57.904Z",
"time" => "2015-08-28 07:49:57.904",
"tags" => [
[0] "Time",
[1] "end_tag",
[2] "elapsed",
[3] "elapsed.match"
],
"end_time" => "2015-08-28 07:49:57.904",
"task_id" => "0eb29fc3-06a6-4f15-a628-c910b9ad08e2",
"elapsed.time" => 3096668.469,
"elapsed.timestamp_start" => "2015-08-28T07:49:23.334Z",
"debug" => "timestampMatched"
Greppy
October 3, 2015, 5:05am
2
Also here's a slimmed down version of my config:
input {
lumberjack {port => 5050 type => "logs" ssl_certificate => "lumberjack.crt" ssl_key => "lumberjack.key"}
}
filter{
multiline{
pattern => "^<TRACE"
what => "previous"
negate => true
}
mutate {
gsub => ["message", "\r\n", "\n"]
}
grok {
break_on_match => false
match => [
"message", "%{TIMESTAMP_ISO8601:time}"
]
add_tag => "Time"
}
grok {
match => ["message", "TRACE date=\"%{TIMESTAMP_ISO8601:start_time}\".*Received message.*(Task: (?<task_id>.*))"]
add_tag => [ "start_tag" ]
}
grok {
match => ["message", "TRACE date=\"%{TIMESTAMP_ISO8601:end_time}\".*Processed message.*(Task: (?<task_id>.*))"]
add_tag => [ "end_tag"]
}
elapsed {
start_tag => "start_tag"
end_tag => "end_tag"
unique_id_field => "task_id"
timeout => "400"
new_event_on_match => false
}
date {
timezone => "Etc/UCT"
match => [ "time", "YYYY-MM-dd HH:mm:ss.SSS" ]
target => "@timestamp "
}
}
output {
elasticsearch {
action => "index"
embedded => false
embedded_http_port => "9200-9300"
flush_size => 5000
idle_flush_time => 1
index => "logstash-%{+YYYY.MM.dd}"
manage_template => true
workers => 1
host => "localhost"
}
stdout { codec => rubydebug }
}
inderjeet26
(Inderjeet Singh)
November 30, 2015, 10:34pm
3
I am having this same issue while using the elapsed filter. The time is way off. It should be in seconds, but the number ("elapsed_time" => 18694775.362) is something very big in my case too!!
DavidL
(David Li)
February 9, 2016, 7:39pm
4
Hi, I have the same issue. Was this ever solved for you? Thanks a lot for sharing.
I'm running into the same issue too; any pointers are greatly appreciated. My log input is "|"-separated and I'm parsing it with the csv filter. Details below.
Log file -
2015-01-28 07:20:21,517|hostnameA|requestId1|IN
2015-01-28 07:20:22,500|hostnameA|requestId1|OUT
2015-01-28 09:06:38,042|hostnameA|requestId2|IN
2015-01-28 09:06:39,767|hostnameA|requestId2|OUT
Filter
csv{
columns => ["logdate","server","reqid","type"]
separator => "|"
}
if [type] == "IN" {
mutate { add_tag => ["serviceStarted"] }
} else if [type] == "OUT" {
mutate { add_tag => ["serviceTerminated"] }
}
elapsed {
start_tag => "serviceStarted"
end_tag => "serviceTerminated"
unique_id_field => "reqid"
timeout => 10000
new_event_on_match => false
}
Can anyone help me work out the time difference between each pair of IN and OUT events?
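Here is the full filter I'm testing with a date block added before elapsed, in case the unparsed timestamp is the culprit (assuming "yyyy-MM-dd HH:mm:ss,SSS" for the comma-millisecond format in the sample log):

```
filter {
  csv {
    columns => ["logdate","server","reqid","type"]
    separator => "|"
  }

  # Parse the log's own timestamp into @timestamp before elapsed runs,
  # so the difference is computed on log time rather than ingest time.
  date {
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }

  if [type] == "IN" {
    mutate { add_tag => ["serviceStarted"] }
  } else if [type] == "OUT" {
    mutate { add_tag => ["serviceTerminated"] }
  }

  elapsed {
    start_tag => "serviceStarted"
    end_tag => "serviceTerminated"
    unique_id_field => "reqid"
    timeout => 10000
    new_event_on_match => false
  }
}
```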
jakpol
(Jakub)
October 18, 2016, 12:22am
6
Hi guys, I solved this issue easily by invoking the date filter before the elapsed filter. Otherwise, if elapsed runs first, it computes against the original @timestamp, which is still the current (ingest) date rather than the parsed log time, so the difference comes out huge.
Hope it helps
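For example, with the config in the first post, move the date block above elapsed, roughly like this (note also that target => "@timestamp " in the original has a trailing space, which writes to a different field than the @timestamp elapsed reads):

```
filter {
  # ... multiline / mutate / grok blocks unchanged ...

  # Parse the log's own time into @timestamp BEFORE elapsed runs,
  # so elapsed measures log time, not ingest time.
  date {
    timezone => "Etc/UCT"
    match => [ "time", "YYYY-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"   # no trailing space
  }

  elapsed {
    start_tag => "start_tag"
    end_tag => "end_tag"
    unique_id_field => "task_id"
    timeout => "400"
    new_event_on_match => false
  }
}
```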