Hi everyone. I am using Logstash version 9.1.0.
While debugging my output plugin, I found that when using the default in-memory queue, Logstash splits the events into several smaller batches.
I have a script that writes 100 records per second to a file, which I use as the input for Logstash.
The output plugin receives the batched events and prints the number of events in each batch.
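For reference, the batch-counting part of my_plugin boils down to roughly this (a simplified sketch; the actual upload logic is omitted):

require "logstash/outputs/base"

class LogStash::Outputs::MyPlugin < LogStash::Outputs::Base
  config_name "my_plugin"
  concurrency :shared

  def register
  end

  # Logstash hands each worker batch to multi_receive, so events.size
  # is the batch size being printed.
  def multi_receive(events)
    puts "Batch size: #{events.size}"
    # ... upload the events to the endpoint ...
  end
end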
Now here is what I found out while testing the different parameters in logstash.yml:
Default - In-memory Queue:
With 100 logs per second written to the file and a default logstash.yml (no modifications), the batch sizes look like this: 20, 33, 27, 20, for a total of 100 events.
Using Persistent Queue:
With the same 100 logs per second written to the file, and the only change in logstash.yml being queue.type: persisted, all 100 records arrived in a single batch of 100 events.
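In other words, the entire diff from the default logstash.yml for this test was this single line:

queue.type: persisted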
The goal:
I would like to know whether there is a way to make the in-memory queue batch events the way the persistent queue does. What I want to achieve is fewer small requests: ideally a single request/batch containing all those events (up to pipeline.batch.size: 100, for example).
What I have tried:
I have already tried playing with these pipeline parameters in logstash.yml:
- pipeline.batch.size
- pipeline.batch.delay
I’ve read that these two are the only parameters that affect batching behavior, and that pipeline.batch.size only sets the maximum number of events per batch, but there might be something I’m missing.
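For example, one combination I tried in logstash.yml looked like this (the values shown are just examples of what I experimented with):

pipeline.batch.size: 100    # upper bound on events per batch, not a guaranteed fill level
pipeline.batch.delay: 50    # how long (ms) to wait for more events before dispatching an undersized batch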
Here is my Logstash .conf file in case it is useful; my_plugin just uploads the events to an endpoint and prints to stdout the number of events in each batch.
input {
  file {
    path => "/path_to_test_logs/logs.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

output {
  my_plugin {}
}