Hi,
I have a specific log entry in a file: Tue May 2 03:54:22 2017
I use a grok filter to extract the timestamp into a field of its own called "datestamp":

    "datestamp" => "Tue May 2 03:57:00 2017 ",
    "@timestamp" => 2017-05-15T14:25:05.490Z,
My filter:

    filter {
      date {
        match => ["datestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ"]
        target => "@timestamp"
      }
    }
I know how to replace @timestamp with datestamp, but it doesn't work (_dateparsefailure). I think the problem is my match pattern, because the actual date format doesn't look like that, right?
Correct. Your date pattern doesn't even resemble what your timestamp actually looks like. Note that the date filter logs what it's having trouble with.
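For a timestamp like "Tue May 2 03:54:22 2017", a Joda-style pattern along these lines should be much closer. This is only a sketch, untested against your exact data; the second pattern with a double space covers syslog-style padding of single-digit days, in case your raw logs use it:

```
filter {
  date {
    match => ["datestamp", "EEE MMM d HH:mm:ss yyyy", "EEE MMM  d HH:mm:ss yyyy"]
    target => "@timestamp"
  }
}
```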
It doesn't work, _dateparsefailure again.
Where can I find the date filter's logs when testing against a file? I run the test with logstash -f, so nothing is written to /var/log/logstash...
It looks like you have a trailing space in the datestamp field, which is not accounted for in the pattern. Correct the pattern used to extract this field, and I don't see why it wouldn't work.
That seems to work. Remember that @timestamp is always in UTC. Instead of adding the space to the date filter pattern, I would recommend modifying the extraction pattern to get rid of it instead.
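If adjusting the grok extraction pattern is awkward, one alternative (a sketch, assuming the stock mutate filter is available) is to strip surrounding whitespace from the field before the date filter runs:

```
filter {
  mutate {
    # remove leading/trailing whitespace from datestamp
    strip => ["datestamp"]
  }
}
```

Fixing the extraction pattern as suggested above is still the cleaner option, since it keeps the stray space out of the event entirely.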
Timestamps in Elasticsearch are assumed to be in UTC, and a lot of functionality, including Kibana, relies on this. Trying to change this is therefore asking for trouble.
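If the original log timestamps are in a local zone rather than UTC, the date filter's timezone option tells Logstash how to interpret them while still storing @timestamp in UTC. A sketch, where the zone name is only a placeholder assumption for whatever your servers actually use:

```
filter {
  date {
    match => ["datestamp", "EEE MMM d HH:mm:ss yyyy"]
    # interpret the parsed time as this zone; @timestamp is still stored in UTC
    timezone => "Europe/Paris"
    target => "@timestamp"
  }
}
```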