I can grok the data successfully with:
%{DATA:process} \(%{INT:processinstance}\) \[%{TIMESTAMP_ISO8601:appserver_timestamp} %{DATA:action}\]%{GREEDYDATA:message}
The result of appserver_timestamp in the above example is:
2016-01-31T01:00:18.707
I have a date filter setup as follows:
date {
  timezone => "America/New_York"
  match => [ "appserver_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
}
However, I continue to get the "_dateparsefailure" tag on my events in Kibana. Can anyone spot the issue? Also, is there such a thing as a debugger for date parsing, like the grok debugger here?
For anyone who comes across this, the fix that worked for me was setting ISO8601 as the match value, since the timestamp uses a literal 'T' between the date and the time rather than the space my original pattern expected:
date {
  timezone => "America/New_York"
  match => [ "appserver_timestamp", "ISO8601" ]
}
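As an alternative to ISO8601, an explicit Joda-style pattern with the 'T' quoted should also parse this timestamp (a sketch, not tested against every event shape):

```
date {
  timezone => "America/New_York"
  # quoting the literal 'T' in the Joda pattern matches
  # timestamps like 2016-01-31T01:00:18.707
  match => [ "appserver_timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
}
```

ISO8601 is the more forgiving choice if the fractional-seconds precision or the separator ever varies.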
Guys, I have tried all of that, but nothing helped in my case. I get the _dateparsefailure tag every time, and the date plugin cannot recognize any format I have tried so far.
I created a custom pattern and added it to my /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/grok-patterns file.
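One quick way to sanity-check a format string against a sample timestamp, before touching the Logstash pipeline, is to approximate the Joda pattern with Python's strptime. This is only an approximation (Joda and strptime directives differ), but it makes the space-vs-'T' mismatch from the original question obvious:

```python
from datetime import datetime

sample = "2016-01-31T01:00:18.707"

# Joda "yyyy-MM-dd HH:mm:ss.SSS" roughly corresponds to this strptime
# format; note the space where the sample has a literal 'T'
try:
    datetime.strptime(sample, "%Y-%m-%d %H:%M:%S.%f")
    print("space format: OK")
except ValueError:
    # this branch fires: the separator does not match
    print("space format: parse failure")

# the 'T' variant (Joda: yyyy-MM-dd'T'HH:mm:ss.SSS) matches the sample
parsed = datetime.strptime(sample, "%Y-%m-%dT%H:%M:%S.%f")
print("T format:", parsed.isoformat(timespec="milliseconds"))
```

If the pattern fails here for the same reason, fixing the pattern (or switching to ISO8601) is the first thing to try before suspecting the grok stage.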