Logstash Failed to Parse Date (6 milliseconds)

"reason"=>"failed to parse date field [Sun Feb 20 03:36:11.782065 2022] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

Since this datestamp has six fractional-second digits (microseconds), I'm having trouble parsing it. Here is what I have in my Logstash config:

    date {
      match => ["logdate", "dd/MMM/yyyy:HH:mm:ss Z", "EEE MMM dd HH:mm:ss.SSS yyyy"]
      target => "logdate"
    }

Below is the date format I used to try to parse a timestamp with six fractional-second digits.

EEE MMM dd HH:mm:ss.SSS yyyy

I did read this thread (How to read microseconds in grok like hh:mm:ss.SSSSSS) but am a little confused as to how I would mutate with gsub, since this datestamp does not end with the fractional seconds.

Use EEE MMM dd HH:mm:ss.SSSSSS yyyy (this works even in 7.x). In 8.0, Logstash supports microsecond precision. Some of the underlying classes support nanosecond precision, but the date filter does not. A workaround for nanosecond-precision @timestamp values is described in this SO post.
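Slotted into the config from the question, that would look something like this (a sketch; the field name logdate and the first pattern are taken from the question):

    date {
      match => ["logdate", "dd/MMM/yyyy:HH:mm:ss Z", "EEE MMM dd HH:mm:ss.SSSSSS yyyy"]
      target => "logdate"
    }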

Hi Badger, I am running Logstash 8.0.0. To clarify, does the 8.0.0 date filter support microseconds?

That depends what you mean by "support".

input { generator { count => 1 lines => [ 'Sun Feb 20 03:36:11.782065 2022' ] } }
filter {
    date { match => [ "message", "EEE MMM dd HH:mm:ss.SSSSSS yyyy" ] }
}

will result in

"@timestamp" => 2022-02-20T08:36:11.782Z,

So it can parse 6 digits of sub-second precision, but it will only use 3 of them.
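If you did want the gsub route mentioned in the question, you could equivalently strip the extra digits before the date filter runs. A sketch (the regex assumes exactly six fractional digits, keeping the first three):

    mutate { gsub => [ "logdate", "(\.\d{3})\d{3}", "\1" ] }
    date {
      match => ["logdate", "EEE MMM dd HH:mm:ss.SSS yyyy"]
      target => "logdate"
    }

The capture group works even though the fraction sits in the middle of the string, since gsub matches anywhere, not just at the end.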

Perfect, microsecond precision isn't needed. Thank you!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.