When does @timestamp get added?


I've got a Logstash pipeline taking events from an HTTP poller input to an Elasticsearch output, configured with a persistent queue. The incoming events do not contain an @timestamp field, and we don't add one at any point in the pipeline.
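For reference, a minimal sketch of that kind of pipeline (the URL, schedule, and index name here are placeholders, not my actual config):

```
input {
  http_poller {
    # hypothetical endpoint and schedule for illustration
    urls => {
      source => "http://example.com/events"
    }
    schedule => { every => "60s" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "events-%{+YYYY.MM.dd}"
  }
}
```

with the persistent queue enabled in logstash.yml via `queue.type: persisted`.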

My understanding - based mainly on a comment by @magnusbaeck in Add field timestamp with current time - is that the @timestamp field is added by the input plugin. However, we recently had an issue where ES was offline for a number of hours and the events built up on the persistent queue. When ES came back online, all of the events from the intervening period were pushed through as expected, but they went into ES with @timestamp set to the current time, rather than the expected behaviour of the timestamp reflecting when the HTTP poller had obtained each event from the source.

So my question is: am I understanding the expected behaviour correctly, and if so, is this a general Logstash issue or one specific to the http_poller input? Thanks!

  • Adrian

The @timestamp is added as the event is created, just before the event is put on the queue. Technically speaking, it is the codec in the input that creates the event and puts it into the queue.
Once @timestamp is set it is not changed - only an explicit filter such as date or ruby can overwrite it.
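For example, if the polled payload carried its own time field (here a hypothetical `event_time` field - adjust the name and format to your data), a date filter could overwrite @timestamp with it:

```
filter {
  date {
    # "event_time" is a placeholder for whatever field your source provides
    match  => ["event_time", "ISO8601"]
    target => "@timestamp"   # this is the default target; shown for clarity
  }
}
```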

Do you have a size limit on the PQ? If the PQ fills up, the codec and input will block waiting for access, and while this blocking happens no polling takes place.
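The relevant PQ limits live in logstash.yml; by way of illustration (the values are examples, not recommendations):

```
queue.type: persisted
queue.max_bytes: 1gb    # total on-disk capacity; events block here when full
queue.max_events: 0     # 0 means no limit on the event count
```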

Back-pressure is the only thing I can think of.

