zozo6015
(Peter)
November 23, 2018, 1:55am
1
Hello, I have a filter where I am trying to convert the timestamp from the logfile into @timestamp. However, since Logstash 6.5.0 this no longer works, which results in the data not being imported into Elasticsearch.
The filter looks like this:
if [year] {
  mutate {
    add_field => {
      "timestamp_match" => "%{month} %{day} %{year} %{time} %{day_period}"
    }
    remove_field => [ "month", "day", "year", "time", "day_period" ]
  }
  mutate {
    convert => { "timestamp_match" => "string" }
  }
  date {
    match => [ "timestamp_match",
               "MMM dd YYYY KK:mm:ss aa",
               "MMM dd YYYY K:mm:ss aa" ]
    timezone => "UTC"
    target => "@timestamp"
  }
}
The result looks like this: "@timestamp" => 2018-11-22T19:16:23.000Z
Any idea how to fix this issue?
Eniqmatic
(Lewis Barclay)
November 23, 2018, 8:13am
2
Does it look like this in Kibana or in raw elastic?
zozo6015
(Peter)
November 23, 2018, 10:35am
3
I cannot see any of the data with that timestamp format in Kibana. I assume it is either not loaded into Elasticsearch, or Kibana cannot show it because it has no valid timestamp.
Eniqmatic
(Lewis Barclay)
November 23, 2018, 10:38am
4
So where are you seeing the result?
Eniqmatic
(Lewis Barclay)
November 23, 2018, 10:43am
6
Can you output the timestamp_match field also and see how that looks?
zozo6015
(Peter)
November 23, 2018, 10:45am
7
Nov 23 05:45:04 vps188864 logstash[28555]: "timestamp_match" => "Nov 23 2018 5:45:03 AM",
Nov 23 05:45:04 vps188864 logstash[28555]: "input" => {
Nov 23 05:45:04 vps188864 logstash[28555]: "type" => "log"
Nov 23 05:45:04 vps188864 logstash[28555]: },
Eniqmatic
(Lewis Barclay)
November 23, 2018, 10:48am
8
Sorry one more question, what do you want it to look like?
zozo6015
(Peter)
November 23, 2018, 10:49am
9
I don't care, as long as it is a valid timestamp so that the data gets loaded into Elasticsearch and shows up in Kibana.
Eniqmatic
(Lewis Barclay)
November 23, 2018, 10:58am
10
Can you try the following:
date {
  match => [ "timestamp_match",
             "MMM dd YYYY hh:mm:ss aa",
             "MMM dd YYYY h:mm:ss aa" ]
  timezone => "UTC"
  target => "@timestamp"
}
Eniqmatic
(Lewis Barclay)
November 23, 2018, 11:01am
11
Also, your filter is currently working, so I don't understand why you don't like the converted timestamp?
zozo6015
(Peter)
November 23, 2018, 11:04am
12
It is not that I don't like it; it just prevents the data from being loaded into Elasticsearch.
Eniqmatic
(Lewis Barclay)
November 23, 2018, 11:09am
13
I don't understand how; it is a correct and valid timestamp as far as I can see. It's the same as my timestamp. What errors do you get?
zozo6015
(Peter)
November 23, 2018, 11:09am
14
Not working. The timestamp format is the same, and I still cannot see the data in Kibana:
Nov 23 06:08:58 vps188864 logstash[6968]: "@timestamp" => 2018-11-23T06:08:50.000Z,
zozo6015
(Peter)
November 23, 2018, 11:11am
15
That is the trick: no errors. The data just does not show up in Kibana, unless I keep the timestamp generated by Filebeat at import time, which lags a few seconds behind.
zozo6015
(Peter)
November 23, 2018, 11:14am
16
Just for information, here is the amount of missing data that is not loaded into Kibana: http://prntscr.com/llzmq3
Eniqmatic
(Lewis Barclay)
November 23, 2018, 11:19am
17
Can you try changing the target in the date filter to a new field to see?
zozo6015
(Peter)
November 23, 2018, 11:24am
18
That is what I was doing as a workaround.
Eniqmatic
(Lewis Barclay)
November 23, 2018, 11:29am
19
Interesting, so I wonder if it doesn't like the @timestamp field being overwritten.
Could you try a remove_field on the timestamp field directly before your date plugin?
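A minimal sketch of that suggestion, using the timestamp_match field from earlier in the thread:

mutate {
  remove_field => [ "@timestamp" ]
}
date {
  match => [ "timestamp_match", "MMM dd YYYY h:mm:ss aa" ]
  timezone => "UTC"
  target => "@timestamp"
}

Note that if the date filter then fails to parse (tagging the event _dateparsefailure instead of setting a new @timestamp), the event reaches the output with no @timestamp at all.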
zozo6015
(Peter)
November 23, 2018, 11:33am
20
It does not like it.
#<LogStash::Error: timestamp field is missing>, :backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:177:in `sprintf'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:68:in `event_action_tuple'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `block in multi_receive'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:114:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:97:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:373:in `block in output_batch'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:372:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:324:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:286:in `block in start_workers'"]}
Nov 23 06:32:31 vps188864 logstash[7882]: [2018-11-23T06:32:31,472][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Nov 23 06:32:31 vps188864 systemd[1]: logstash.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 06:32:31 vps188864 systemd
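For what it's worth, the backtrace points at the sprintf call in the elasticsearch output: the default index name interpolates the event date from @timestamp, so an event that reaches the output without @timestamp cannot be routed and raises "timestamp field is missing". A sketch of the relevant output setting (the hosts value is a placeholder):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # the %{+YYYY.MM.dd} date reference is expanded from @timestamp;
    # it cannot be resolved when that field has been removed
    index => "logstash-%{+YYYY.MM.dd}"
  }
}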