How to restrict Logstash to process a new record only after the previous record is completely processed

Hi Team,

Whenever a new record enters Logstash, it should wait until the previous record is completely processed; only then should the new record be processed by Logstash.
Please suggest how to configure this in Logstash.


Try setting the batch size (`pipeline.batch.size`) and the number of pipeline workers (`pipeline.workers`) to 1 (one).
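As a minimal sketch, those two settings would look like this in `logstash.yml` (the same values can also be passed on the command line as `-w 1 -b 1`):

```yaml
# With a single worker thread and a batch size of one,
# events flow through the filter stage strictly one at a time.
pipeline.workers: 1
pipeline.batch.size: 1
```

Note that this serializes the whole pipeline, so throughput will drop accordingly.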

This is a weird requirement. You might get more relevant help if you explain the background.

My requirement is: each time a record arrives in Logstash for processing, I need to query Elasticsearch for related records and copy certain fields from them to the current record.
Whenever I load a file, the related records exist within that same file, which is why I need the records to be processed one at a time; otherwise I want the related events/records to wait until the currently processing related event is completely processed and written to Elasticsearch.

Please let me know if you need some more background so that I can explain.


How are you locating the related records, via a query in an elasticsearch filter?

Yes, in the Logstash filter section I am using the elasticsearch filter plugin, and inside it I am using the query template option pointing to a file that contains the Elasticsearch query.
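For context, a sketch of that kind of filter configuration; the host, index name, template path, and field names here are placeholders, not the actual setup:

```ruby
filter {
  elasticsearch {
    hosts          => ["localhost:9200"]
    index          => "my-index"
    # JSON query DSL template; %{...} event fields are substituted in.
    query_template => "/etc/logstash/templates/related-record.json"
    # Copy fields from the matched document into the current event.
    fields         => { "some_field" => "copied_field" }
  }
}
```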

That'll never work reliably, since Elasticsearch is a near-real-time search engine and queries don't have strong consistency guarantees. What that means in practice is that newly indexed documents don't become searchable until the next index refresh, which can take a second or more depending on the configuration. It's possible to make a document visible immediately by requesting a refresh operation, but doing that for every document is far too costly.

I think you need to consider another option, e.g. preprocessing the input elsewhere, or writing a custom Logstash plugin that both updates a cache with the current document and looks up related documents in that same cache.
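Short of a custom plugin, the same cache idea can be sketched with the stock ruby filter. This is only a sketch under two assumptions: the `record_id`/`parent_id` field names are hypothetical placeholders, and it relies on `pipeline.workers: 1` so the in-memory hash is not accessed concurrently:

```ruby
filter {
  ruby {
    # init runs once at startup; @cache persists across events.
    init => "@cache = {}"
    code => "
      # Look up a related record cached earlier in this run
      # and copy a field from it onto the current event.
      parent = @cache[event.get('parent_id')]
      event.set('copied_field', parent['some_field']) if parent

      # Cache this record so later related records can find it.
      @cache[event.get('record_id')] = event.to_hash
    "
  }
}
```

This avoids the refresh problem entirely, because related records are found in memory rather than via an Elasticsearch query, but the cache is lost on restart and only covers records seen within the same pipeline run.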

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.