So I've been creating Logstash pipelines for a while now, and this is the first time I've come across an issue I couldn't find a solution to online or debug further, so I decided to post here.
I'm getting '_dateparsefailure' in my output tags while trying to parse a simple ISO8601 timestamp from JSON I'm receiving from AWS Kinesis. This is my pipeline:
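(The original pipeline config was not preserved in this thread; the sketch below is a hedged reconstruction of the kind of filter being described, assuming a `json` filter feeding a `date` filter, with the `newtimestamp` target field mentioned later in the post.)

```
input { ... }

filter {
  # Parse the Kinesis payload as JSON
  json { source => "message" }

  # Attempt to parse the ISO8601 timestamp into a new field
  date {
    match  => ["@timestamp", "ISO8601"]
    target => "newtimestamp"
  }
}

output { ... }
```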
I've already set Logstash's log level to trace, but there is not a single date parse failure hint in the logs.
Trying to match "YYYY-MM-dd'T'H:mm:ss.SSS'Z'" results in the same problem: no 'newtimestamp' field comes up in my output file, and I get '_dateparsefailure' in the tags.
A date filter parses a string. It cannot parse a LogStash::Timestamp. The @timestamp field is special: if a json filter sees a field called @timestamp, it will try to parse it into a LogStash::Timestamp...
Hello Badger, thanks for the insight. I did consider this before, so I tried copying (with a mutate filter) the value of @timestamp to another field, then applying the date parse on that field, but got the same result.
Also, the JSON I'm receiving from Kinesis has another field called timestamp (without the '@'). The value of this field has no double quotes, so I guess this is the field Logstash is considering as a timestamp?
Its type will not change when you copy it. You could mutate+convert it to a string and then parse it, but why would you want to do that? It is already parsed!
A mutate filter does its operations in a fixed order, and convert comes before copy, so the copied field would not exist yet when the convert executes, making the convert a no-op. Split the mutate into two filters.
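A sketch of the split described above (the scratch field name `tmp` and the target `newtimestamp` are illustrative assumptions, not from the original config):

```
filter {
  # First mutate: copy @timestamp into a scratch field
  mutate { copy => { "@timestamp" => "tmp" } }

  # Second mutate: the field now exists, so convert actually runs
  mutate { convert => { "tmp" => "string" } }

  # Parse the string copy into the desired field
  date {
    match  => ["tmp", "ISO8601"]
    target => "newtimestamp"
  }
}
```

With copy and convert in a single mutate filter, the convert would silently do nothing, and the date filter would then be handed a LogStash::Timestamp it cannot parse.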