date {
  match    => [ "timestampInUtc", "UNIX_MS" ]
  target   => "timestamp"
  timezone => "UTC"
}
This works correctly except when the milliseconds are 000; in that case they get truncated. For instance, a timestamp of 1678895447001 is correctly converted to 2023-03-15T15:50:47.001Z, but a timestamp of 1678895447000 is converted to 2023-03-15T15:50:47Z. The milliseconds are dropped.
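The same "zero fraction gets omitted" behavior can be reproduced outside Logstash. As a minimal sketch in Python (the helper `epoch_ms_to_dt` is hypothetical, not part of Logstash): `isoformat()` drops the fractional part when it is zero, while forcing a fixed precision keeps it.

```python
from datetime import datetime, timezone


def epoch_ms_to_dt(ms):
    # Split epoch milliseconds into whole seconds and a remainder,
    # avoiding float rounding, then attach the remainder as microseconds.
    sec, rem = divmod(ms, 1000)
    return datetime.fromtimestamp(sec, tz=timezone.utc).replace(microsecond=rem * 1000)


# Non-zero milliseconds survive the default formatting.
print(epoch_ms_to_dt(1678895447001).isoformat())  # 2023-03-15T15:50:47.001000+00:00

# Zero milliseconds are silently omitted by default...
print(epoch_ms_to_dt(1678895447000).isoformat())  # 2023-03-15T15:50:47+00:00

# ...but can be forced with a fixed timespec.
print(epoch_ms_to_dt(1678895447000).isoformat(timespec="milliseconds"))  # 2023-03-15T15:50:47.000+00:00
```

This illustrates that the data itself is not lost; only the string serialization decides whether a zero fraction is printed.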
It is unclear what you want from a workaround. The LogStash::Timestamp object has its milliseconds (or nanoseconds) set to zero; the issue only appears when that timestamp is converted to a string. When are you doing the string conversion?
If it is being converted when sending to Elasticsearch, then according to yauuie's PR ...
Workaround: without this patch, the Elasticsearch field's mapping would need to be adapted to add date_time_no_millis, which accepts values that do not have fractional digits.
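That mapping change could look like the following sketch, assuming an index named `my-index` and the `timestamp` target field from the filter above. `date_time` and `date_time_no_millis` are built-in Elasticsearch date formats; chaining them with `||` lets the field accept values both with and without milliseconds.

```json
PUT my-index
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "date_time||date_time_no_millis"
      }
    }
  }
}
```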