How to make filebeat ship logs in the same order as in the log file

I can see that Filebeat is not shipping logs to Logstash in the same order as they appear in the log file. Can I make Filebeat ship the logs in the same order as in the log file? Sorting later in Elasticsearch by timestamp does not solve my need, because I want to make sure the log lines for a user's request are in order before combining them into a single event in Logstash based on a unique ID.

Assuming you are outputting to a single Logstash instance, using a single worker, and have async publishing disabled, I would expect the delivery of log messages from a single file to be in order (possibly interleaved with messages from other log files being read). Could it be happening on the Logstash side? Based on the execution model described on that page, I think it could.
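A minimal sketch of the settings described above, assuming the Logstash output; the hostname is a placeholder, and the exact option names may differ between Filebeat versions:

```yaml
# filebeat.yml (sketch) -- aim for in-order delivery from a single file
output.logstash:
  hosts: ["logstash.example.com:5044"]  # a single Logstash instance (hypothetical host)
  worker: 1                             # a single worker per host
  pipelining: 0                         # disable async (pipelined) publishing
```

With one output host, one worker, and pipelining disabled, batches are sent and acknowledged one at a time, which is what makes in-order delivery from a single file plausible.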

In general, Beats does not provide an ordered-delivery guarantee, only an at-least-once delivery guarantee.

Would it be possible to use multiline on the Beats side to combine the related log lines into a single event before sending it to Logstash? Then, once in Logstash, you could parse all the data from that one event. This would require that the related log lines are not interleaved with other log messages.
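For illustration, a sketch of a Filebeat multiline configuration; the path and pattern are hypothetical and would need to match your actual log format:

```yaml
# filebeat.yml (sketch) -- join related lines into one event before shipping
filebeat.prospectors:
  - paths:
      - /var/log/app/request.log   # hypothetical log path
    multiline:
      pattern: '^REQ-[0-9]+'       # hypothetical marker that starts each request
      negate: true                 # lines NOT matching the pattern...
      match: after                 # ...are appended to the preceding matching line
```

Here every line that does not begin with the request marker is folded into the previous event, so one request's lines arrive at Logstash as a single event. Again, this only works if the lines of one request are contiguous in the file.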


I was researching, and I don't think this can be done by Filebeat. Please see this post: Filebeat multiline by Queue ID

Also, I found the following in the documentation:

If you are using a Logstash input plugin that supports multiple hosts, such as the beats input plugin, you should not use the multiline codec to handle multiline events. Doing so may result in the mixing of streams and corrupted event data. In this situation, you need to handle multiline events before sending the event data to Logstash.

If you think Logstash might pick up the threads out of order, how can I make it process them in order?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.