I have a stream of json logs being sent into elastic via logstash.
The input stream has inconsistent date formats, in one of these two formats:
"YYYY-MM-dd'T'HH:mm:ssZ", or "YYYY-MM-dd HH:mm:ss"
For example, some of them are in this format:
"started": "2019-10-14T07:54:00-07:00",
"finished": "2019-10-14T07:54:30-07:00",
And some of them are in this format:
"started": "2019-10-14 16:06:39",
"finished": "2019-10-14 16:07:18",
I wrote this filter to match both:
date {
  match => [ "datestart", "YYYY-MM-dd'T'HH:mm:ssZ", "YYYY-MM-dd HH:mm:ss" ]
}
I thought this would match both, but when I start Logstash I get this error for the second format:
"reason"=>"failed to parse date field [2019-10-14 16:06:39] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
Any suggestions?
I am exploring cutting off the timezone with grok, since it doesn't really matter in my case, but it would be much easier if I could just handle both formats.
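For reference, the timezone-stripping route I'm considering would be roughly this (a sketch, untested; I'm using a mutate gsub instead of grok since it seems simpler for just dropping a trailing offset, and `datestart` is the same field as in my filter above):

```
filter {
  mutate {
    # sketch, untested: strip a trailing UTC offset like "-07:00", if present
    gsub => [ "datestart", "[+-]\d{2}:\d{2}$", "" ]
  }
  date {
    # after stripping, both inputs should reduce to one of these two patterns
    match => [ "datestart", "YYYY-MM-dd'T'HH:mm:ss", "YYYY-MM-dd HH:mm:ss" ]
  }
}
```

But I'd prefer not to throw the offset away if there's a way to parse both formats directly.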