We're using this date filter:
filter {
  date {
    # First pattern matches the odd "Z before the millis" form we see in our
    # logs (e.g. 2019-11-27T02:18:15Z.599); the second matches nanosecond
    # precision with a trailing Z; ISO8601 catches everything else.
    match => [ "ts", "yyyy-MM-dd'T'HH:mm:ss'Z'.SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSSS'Z'", "ISO8601" ]
    timezone => "UTC"
  }
}
And that seems to work for our ISO8601 dates, but we're seeing these warnings spamming our Logstash logs:
[2019-11-26T18:18:20,636][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.3.0-2019.11.27", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x44858188>], :response=>{"index"=>{"_index"=>"filebeat-6.3.0-2019.11.27", "_type"=>"_doc", "_id"=>"S-2mqm4B6hZBfKts_AtW", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ts] of type [date] in document with id 'S-2mqm4B6hZBfKts_AtW'. Preview of field's value: '2019-11-27T02:18:15Z.599'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019-11-27T02:18:15Z.599] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
It seems like those dates should match one of our patterns, so I'm not sure what exactly is going wrong. The filter didn't include ISO8601 until today, when we added it, but I don't see how that change could cause this. Any ideas what we might be doing wrong?
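In case it helps with reproducing, here's a minimal pipeline sketch that should exercise the first pattern in isolation (assuming the raw timestamp is piped in on stdin, where it lands in the message field; the test.conf file name is just a placeholder):

input { stdin {} }
filter {
  date {
    # Same pattern as the first one above, applied to the raw stdin line
    match => [ "message", "yyyy-MM-dd'T'HH:mm:ss'Z'.SSS" ]
    timezone => "UTC"
  }
}
output { stdout { codec => rubydebug } }

Feeding it the value from the error above, e.g. echo "2019-11-27T02:18:15Z.599" | bin/logstash -f test.conf, should show whether @timestamp gets set or the event picks up a _dateparsefailure tag.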