Malformed date format


We are seeing "malformed date" errors on our Logstash hosts even though the pattern should match.
Logstash: 2.4
Elasticsearch: 2.4

The Logstash config (excerpt):

```
if [message] {
  grok {
    patterns_dir => [ "/etc/logstash/patterns/grok-patterns" ]
    # Wildfly PHP Apache
    match => { "message" => [ "\A%{TIMESTAMP_ISO8601:logdate}",
                              # ... remaining patterns truncated in this excerpt ...
                            ] }
  }
  date {
    match => [ "logdate", "ISO8601", "yyyy/MM/dd HH:mm:ss", "dd-MMM-yyyy HH:mm:ss ZZZ", "dd/MMM/yyyy:HH:mm:ss Z" ]
    add_tag => [ "date_modified" ]
  }
}
```

The error data:

```
{:timestamp=>"2016-10-05T04:27:02.179000+0200", :message=>"Failed parsing date from field", :field=>"logdate", :value=>"2016-10-05 04:26:54", :exception=>"Invalid format: "2016-10-05 04:26:54" is malformed at "16-10-05 04:26:54"", :config_parsers=>"ISO8601,yyyy/MM/dd HH:mm:ss,dd-MMM-yyyy HH:mm:ss ZZZ,dd/MMM/yyyy:HH:mm:ss Z", :config_locale=>"default=en_US", :level=>:warn}
```

Has anyone experienced the same?


It seems the ISO8601 date pattern requires milliseconds. Either tack ",000" onto the end of logdate, or add a new pattern based on the existing ones but without the ".SSS" suffix.
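A minimal sketch of the first suggestion, assuming `logdate` is a plain string field (the mutate filter must run before the date filter; whether the strict ISO8601 parser then accepts the space-separated value is worth verifying):

```
mutate {
  # Append ",000" so the value carries the milliseconds the strict
  # ISO8601 parser expects, e.g. "2016-10-05 04:26:54,000"
  replace => { "logdate" => "%{logdate},000" }
}
```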


Of course, we just needed to add "yyyy-MM-dd HH:mm:ss" instead of "yyyy/MM/dd HH:mm:ss".
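For reference, the corrected date filter then looks like this (the same match list as the original config, with the slash-separated pattern swapped for the dash-separated one that actually matches "2016-10-05 04:26:54"):

```
date {
  match => [ "logdate", "ISO8601", "yyyy-MM-dd HH:mm:ss", "dd-MMM-yyyy HH:mm:ss ZZZ", "dd/MMM/yyyy:HH:mm:ss Z" ]
  add_tag => [ "date_modified" ]
}
```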

Thank you.