Hi,
I have around five pipelines running on a Logstash node, sending output to Elasticsearch (a 4-node cluster).
Two of the pipelines (a and b) need high throughput in some scenarios. Normally each of them receives around 1,000 events/s, but sometimes we get storms of 8,000+ events/s, and this can go up to 20,000 events/s.
This is my pipelines.yml configuration:
- pipeline.id: a
  pipeline.workers: 10
  pipeline.batch.size: 1000
  pipeline.batch.delay: 10
- pipeline.id: b
  pipeline.workers: 10
  pipeline.batch.size: 1000
  pipeline.batch.delay: 10
This works fine during normal traffic, but during a storm the queues grow large and indexing is delayed.
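I have been watching this through the node stats API (assuming the default API port 9600), which shows per-pipeline event counts and queue stats:

curl -s 'localhost:9600/_node/stats/pipelines?pretty'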
How should I set these pipeline settings to get maximum throughput from Logstash during the storms?
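One thing I was considering is enabling the persistent queue, so Logstash can buffer the bursts on disk instead of backing up the inputs. Something like this sketch (the queue.max_bytes value is just a guess on my part, not something I have tested):

- pipeline.id: a
  pipeline.workers: 10
  pipeline.batch.size: 1000
  pipeline.batch.delay: 10
  queue.type: persisted    # buffer bursts on disk instead of in memory
  queue.max_bytes: 4gb     # upper bound for the on-disk queue
- pipeline.id: b
  pipeline.workers: 10
  pipeline.batch.size: 1000
  pipeline.batch.delay: 10
  queue.type: persisted
  queue.max_bytes: 4gb

Would that help here, or should I be raising pipeline.workers and pipeline.batch.size instead?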
Thanks
Anu