Logstash date filter truncating milliseconds when they are set to 000

We use the following filter:

date {
  match => [ "timestampInUtc" , "UNIX_MS" ]
  target => "timestamp"
  timezone => "UTC"
}

This works correctly except when the milliseconds are set to 000. In that case, they get truncated.

For instance, a timestamp of 1678895447001 is correctly converted to 2023-03-15T15:50:47.001Z.
But a timestamp of 1678895447000 is converted to 2023-03-15T15:50:47Z; the milliseconds are dropped.
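
Here is a minimal pipeline sketch that should reproduce this (the generator input and stdout output are only for illustration; our real pipeline is different):

input {
  generator {
    count => 1
    # One value with millis .001 and one with millis .000
    lines => ["1678895447001", "1678895447000"]
  }
}

filter {
  date {
    match => [ "message", "UNIX_MS" ]
    target => "timestamp"
    timezone => "UTC"
  }
}

output {
  stdout { codec => rubydebug }
}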

We are using Logstash 8.3.3.

Would that be a bug? Is there a workaround?

Thanks

Yes, it is a bug. It was fixed by this commit.

It is unclear what you want from a workaround. The LogStash::Timestamp object has its milliseconds (or nanoseconds) set to zero; the issue only appears when that value is converted to a string. When are you doing the string conversion?
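
If you are doing the conversion yourself, one option could be a ruby filter that formats the value with a fixed three-digit millisecond part. This is an untested sketch, and the timestamp_str field name is just an example:

ruby {
  code => '
    ts = event.get("timestamp")
    unless ts.nil?
      # Format the underlying Time with an explicit 3-digit millisecond part,
      # so .000 is kept rather than dropped
      event.set("timestamp_str", ts.time.strftime("%Y-%m-%dT%H:%M:%S.%LZ"))
    end
  '
}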

If it is being converted when sending to Elasticsearch, then, according to yauuie's PR ...

Workaround: without this patch, the Elasticsearch field's mapping would need to be adapted to add date_time_no_millis, which accepts a value that does not have fractional digits.
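
For example, that mapping change might look something like this (the my-index index name and timestamp field name are placeholders):

PUT my-index
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "date_time||date_time_no_millis"
      }
    }
  }
}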


Thanks for the details. I'll just upgrade. No worries.
