I have the following filter in my Logstash pipeline:
date {
  match => [ "[fields][serverrequest][starttime]", "UNIX_MS" ]
  tag_on_failure => [ "serverrequest_logstyle_dateparsefailure" ]
}
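In case it's relevant, one thing I'm considering to narrow this down is having the date filter write its result to a separate field instead of @timestamp, so I can compare the two directly (starttime_parsed is just a scratch field name I made up):

date {
  match => [ "[fields][serverrequest][starttime]", "UNIX_MS" ]
  # write the parsed value to a scratch field rather than overwriting @timestamp
  target => "starttime_parsed"
  tag_on_failure => [ "serverrequest_logstyle_dateparsefailure" ]
}

That should at least tell me whether the UNIX_MS parse itself produces the value I expect, independent of whatever ends up in @timestamp.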
However, the resulting document in Elasticsearch shows a @timestamp value that does not match the parsed field. For example, one document has:
@timestamp: Jan 27, 2021 @ 11:11:14.891
fields.serverrequest.starttime: 1611722342306
Parsing that starttime on epochconverter.com gives Wednesday, 27 January 2021 04:39:02.306, which is what I expect based on the other details in the document, and which also matches the last 3 digits of the starttime field (.306, not the .891 shown in @timestamp).
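To double-check that conversion inside the pipeline itself, I could also compute the same value with a ruby filter (starttime_check is just a throwaway field name):

ruby {
  # convert the epoch-millis field to an ISO8601 UTC string for comparison
  code => 'event.set("starttime_check", Time.at(event.get("[fields][serverrequest][starttime]").to_i / 1000.0).utc.strftime("%Y-%m-%dT%H:%M:%S.%LZ"))'
}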
I can't see any backlog in the Filebeat / Logstash monitoring that would explain a ~7-hour delay, and I don't see any documents with a @timestamp in the future.
I'm at a loss to explain what's going on.