Multiple pipelines in one Logstash with Kafka


Currently we are using the following chain:
logfile -> filebeat -> logstash -> elasticsearch.

Sometimes I see in our logs that Filebeat cannot connect to Logstash, or is rejected by it.
That's why I am considering introducing Kafka into our stack, so that we have the following chain:

logfile -> filebeat -> kafka -> logstash -> elasticsearch.

We have about 25-30 log file types, identified by a field "logType". Each needs completely different parsing. Currently we use conditionals in a single main pipeline.
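To illustrate, the current single-pipeline setup looks roughly like this (the logType values and filters here are hypothetical examples, only the field name comes from our setup):

```
filter {
  if [logType] == "apache-access" {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  } else if [logType] == "app-json" {
    json { source => "message" }
  }
  # ... roughly 28 more branches, one per logType
}
```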

When configuring the Kafka output in Filebeat, I see the option to use the logType field as the topic.
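A minimal sketch of that Filebeat output, assuming logType is a top-level field and with placeholder broker addresses:

```yaml
# filebeat.yml: route each event to a Kafka topic named after its logType
output.kafka:
  hosts: ["kafka1:9092", "kafka2:9092"]  # placeholder brokers
  topic: "%{[logType]}"                  # one topic per log type
  required_acks: 1
```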
For Logstash I am thinking of setting up one pipeline per topic, so I can also drop the conditionals that check which log files to process.
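In pipelines.yml that could look like the sketch below; pipeline ids, topic names, broker address, and filters are all hypothetical examples:

```yaml
# pipelines.yml: one Logstash pipeline per Kafka topic, no conditionals needed
- pipeline.id: apache-access
  config.string: |
    input {
      kafka {
        bootstrap_servers => "kafka1:9092"
        topics => ["apache-access"]
        group_id => "logstash-apache-access"
      }
    }
    filter { grok { match => { "message" => "%{COMBINEDAPACHELOG}" } } }
    output { elasticsearch { hosts => ["http://es:9200"] } }
- pipeline.id: app-json
  path.config: "/etc/logstash/conf.d/app-json.conf"
# ... one entry per logType
```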

Is this the right / a common approach?
Are there any pitfalls?
Should I expect any issues when a single Logstash instance (which is what we currently run) processes about 30 Kafka topics?

Thanks a lot, Andreas

