Logstash multi pipeline optimal configuration

Greetings!

I am currently running Logstash 7.9.2 on a production node, with Logstash being the only component of the stack installed there. My node has 8 cores. I am running a multiple pipeline configuration, with some pipelines being more demanding than others in terms of event processing and the volume of events they handle.

I have a few questions on the topic:

  1. Should I configure specific pipeline.workers and pipeline.batch.size values for each pipeline in the pipelines.yml file, or just specify a single number of workers and batch size in the logstash.yml file? I am pretty sure the first option is the right one (see the sketch after this list for what I mean), but any additional guidance would be appreciated.
  2. Is there a rule of thumb for how many workers should be assigned to a pipeline, based on the events per second it handles? I could not find anything... :frowning:
  3. If I do not specify workers and a batch size for some pipelines, will they get the default values I have specified in logstash.yml, or will that setting override my configuration in the pipelines.yml file?
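
To illustrate what I mean by the first option, here is a rough sketch of the kind of pipelines.yml I have in mind. The pipeline ids and config paths are made up for the example; the idea is that heavier pipelines get more workers and a larger batch size, while pipelines with no overrides would fall back to whatever is in logstash.yml:

```yaml
# pipelines.yml -- pipeline ids and paths below are hypothetical, for illustration only

- pipeline.id: heavy-ingest              # high-volume pipeline: more workers, bigger batches
  path.config: "/etc/logstash/conf.d/heavy/*.conf"
  pipeline.workers: 4
  pipeline.batch.size: 250

- pipeline.id: light-ingest              # low-volume pipeline: keep it small
  path.config: "/etc/logstash/conf.d/light/*.conf"
  pipeline.workers: 1
  pipeline.batch.size: 125

- pipeline.id: default-ingest            # no overrides: should pick up the defaults from logstash.yml
  path.config: "/etc/logstash/conf.d/default/*.conf"
```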

Thanks in advance,
Dimitris
