Further details of how a pipeline configuration works

I'm wondering about my pipeline setup (pipelines.yml):

```yaml
- pipeline.id: bal
  path.config: "/usr/share/logstash/pipeline/1*.conf"
  pipeline.workers: 24
  pipeline.batch.size: 2000
- pipeline.id: soul
  path.config: "/usr/share/logstash/pipeline/2*.conf"
  pipeline.workers: 24
  pipeline.batch.size: 2000
- pipeline.id: outputs
  path.config: "/usr/share/logstash/pipeline/9*.conf"
  pipeline.workers: 8
  pipeline.batch.size: 2000
```

For instance, I have set pipeline.workers to 24 for the input and parsing pipelines (two independent Redis inputs), while the output pipeline is set to 8 workers. My question starts here: am I creating a bottleneck by having 24 workers for input and filtering while the output has only 8?
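
For context, a layout like this is typically wired together with Logstash's pipeline-to-pipeline communication, where the parsing pipelines forward events to the outputs pipeline over an internal address. The .conf files aren't shown in the post, so the following is only a minimal sketch of that pattern; the Redis host/key, the Elasticsearch host, the index name, and the file names are placeholders, not taken from the original setup:

```conf
# 10-bal.conf (hypothetical name) -- read from Redis, parse, then hand off
input {
  redis {
    host      => "redis.example.com"   # placeholder
    data_type => "list"
    key       => "bal"                 # placeholder
  }
}
filter {
  # parsing/enrichment for this stream goes here
}
output {
  pipeline { send_to => ["outputs"] }  # internal address, not a network port
}

# 90-outputs.conf (hypothetical name) -- single pipeline that talks to the sink
input {
  pipeline { address => "outputs" }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch.example.com:9200"]  # placeholder
    index => "logs-%{+YYYY.MM.dd}"                      # placeholder
  }
}
```

If the pipelines are connected this way, back-pressure propagates upstream: when the outputs pipeline cannot keep up, the internal address blocks, the parsing pipelines stall, and the Redis inputs stop pulling, rather than events being dropped.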
