Hoping someone can assist me with my issue below:
I have my Logstash conf set up to use the csv input plugin. The data includes a date field with values like the following…
2024-01-09 22:21:04
I then have this logic in the filter plugin to handle the date…
date {
  match => ["cart_received_timestamp", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"]
  target => "@timestamp"
}
I am getting the following error (I’m using a strict date format in my index mapping to reject invalid data). The input date:
2024-01-09 22:21:04
… and the format the index expects:
yyyy-MM-dd HH:mm:ss||yyyy-MM-dd'T'HH:mm:ss.SSS'Z'
… should match.
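For reference, the relevant part of the index mapping looks roughly like this ("my-index" is a placeholder name here; the field type and format string are as above):

PUT my-index
{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
      }
    }
  }
}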
It seems like Logstash converts my date to the format below:
'2024-01-09T22:21:04.567414642Z'
… causing the error as it does not match the index mapping for this field’s required format.
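Side by side, the mismatch seems to come down to the fractional seconds (and the 'T'/'Z' literals):

2024-01-09T22:21:04.567414642Z     <- what Logstash emits (9 fractional digits, nanoseconds)
yyyy-MM-dd HH:mm:ss                <- accepted: no 'T', no fraction, no 'Z'
yyyy-MM-dd'T'HH:mm:ss.SSS'Z'       <- accepted: exactly 3 fractional digits (milliseconds)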
{"update"=>{"status"=>400, "error"=>{"type"=>"document_parsing_exception", "reason"=>"[1:539] failed to parse field [@timestamp] of type [date] in document with id 'punchout_cart_item_182439'. Preview of field's value: '2024-01-09T22:21:04.567414642Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2024-01-09T22:21:04.567414642Z] with format [yyyy-MM-dd HH:mm:ss||yyyy-MM-dd'T'HH:mm:ss.SSS'Z']", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}
I’ve tried various changes to the format in the conf file (like ISO8601, adding the convert option to change the field to date_time, and a ruby filter with code to change the date format that ChatGPT recommended) … none of them changed the error condition.
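For example, one variant I tried was swapping the second pattern for the date filter's built-in ISO8601 keyword, roughly like this:

date {
  match => ["cart_received_timestamp", "yyyy-MM-dd HH:mm:ss", "ISO8601"]
  target => "@timestamp"
}

The parsing side seems fine either way; the rejection appears to happen when the emitted @timestamp value is indexed, so the error stays the same.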