Issue with date format

We're using this date filter:

filter {
  date {
    match => [ "ts", "yyyy-MM-dd'T'HH:mm:ss'Z'.SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSSS'Z'", "ISO8601" ]
    timezone => "UTC"
  }
}

And that seems to work for our ISO8601 dates, but we're seeing this spamming our Logstash logs:

[2019-11-26T18:18:20,636][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.3.0-2019.11.27", :_type=>"_doc", :routing=>nil}, #LogStash::Event:0x44858188], :response=>{"index"=>{"_index"=>"filebeat-6.3.0-2019.11.27", "_type"=>"_doc", "_id"=>"S-2mqm4B6hZBfKts_AtW", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ts] of type [date] in document with id 'S-2mqm4B6hZBfKts_AtW'. Preview of field's value: '2019-11-27T02:18:15Z.599'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019-11-27T02:18:15Z.599] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

It seems like we should be matching those dates correctly, so I'm not sure what exactly we're doing wrong. The filter didn't have ISO8601 in it until we added it today, and I'm not sure how adding that would have caused this. Any ideas what we might be doing wrong?

What is the format of the "ts" field that you are using, and what does it look like?

As per my understanding, the "ts" field type is mapped as date in your index mapping, but when the "ts" value is not in ISO8601 format, Elasticsearch is unable to parse the field. So check the timestamp value that is being generated for "ts". The error you posted shows that "ts" received the input 2019-11-27T02:18:15Z.599, which cannot be parsed.
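A quick sanity check, assuming the index name from the error above, is to ask Elasticsearch how it actually mapped the field:

GET /filebeat-6.3.0-2019.11.27/_mapping/field/ts

If the type comes back as date, any document whose "ts" value cannot be parsed by the mapping's date formats will be rejected with exactly this kind of mapper_parsing_exception.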

Yeah, I think you're right. I think the ts values we've been using in some applications have been indexed as strings, and now that we're putting an ISO8601 value in there it's being parsed as a date, and they're conflicting. Bleh. I guess one of them is going to move to a different field then.
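If one of them does move, the rename could happen in Logstash before the date filter runs. A minimal sketch (ts_raw is a made-up name, and you would likely wrap this in a conditional so it only applies to the offending events):

filter {
  mutate {
    # keep the non-standard timestamp string under a separate field
    rename => { "ts" => "ts_raw" }
  }
}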

The date filter parses [ts] and updates @timestamp. It does not modify [ts]. Elasticsearch expects [ts] to be a date, either because you have a template that says so, or because on the first document that had a [ts] field it was a LogStash::Timestamp.

However, none of the default date parsers in Elasticsearch can parse 2019-11-27T02:18:15Z.599, so you need to add a custom parser in your index template.
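An alternative that avoids the custom parser entirely, if it fits your pipeline: give the date filter a target so the parsed value is written back into [ts], which then reaches Elasticsearch as a standard ISO8601 timestamp. A minimal sketch based on your existing filter:

filter {
  date {
    match => [ "ts", "yyyy-MM-dd'T'HH:mm:ss'Z'.SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSSS'Z'", "ISO8601" ]
    timezone => "UTC"
    # overwrite [ts] with the parsed timestamp instead of updating @timestamp
    target => "ts"
  }
}

Note that with target set, @timestamp is no longer updated from [ts]; you would need a second date filter if you want both.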

Ahhh, ok, the custom parser seems like it might be the right way to go. Does this seem like the correct way to do that?

PUT /_template/date_mapping
{
  "index_patterns": ["*"],
  "mappings": {
    "properties": {
      "ts": {
        "type": "date",
        "format": "yyyy-MM-dd'T'HH:mm:ss'Z'.SSS||strict_date_optional_time||epoch_millis"
      }
    }
  }
}

That must not be the full extent of what I need to do because after doing so, and verifying that the mapping shows up in the list when I GET /_template, I'm still getting the same error.

Tried reindexing and that didn't work. It turns out index templates are only applied when an index is created, so the existing index keeps its old mapping for "ts". We've already lost the data anyway, since the date mismatch discards the entry, so I'll try deleting the index and letting Elasticsearch re-create it.
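For reference, a sketch of that last step using the index name from the error above (destructive, so only sensible once the data is already considered lost): confirm the template is present, then drop the index so the next event re-creates it with the new mapping.

GET /_template/date_mapping

DELETE /filebeat-6.3.0-2019.11.27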

It looks right to me, although I do not have Elasticsearch running, so I am unable to test it.
