I have Filebeat configured to send some modules to Logstash, and on Logstash I have a pipeline that handles syslog (the file from the documentation).
I want to add some more custom logs in Filebeat and I'm wondering what's the best approach to parsing them. Since my pipeline config for syslog is in one file, should I separate input, filter and output, and create a filter config for each thing I want to parse?
I've read some answers for 6.x saying this is solved by running two Filebeat instances; is that still the best option for 7.x?
You could use the same Filebeat instance to send data to Logstash. You can add tags to the various inputs and then use conditionals to apply specific filters (https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html).
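For example, a minimal sketch of that approach (the `myapp` tag, the log path, and the grok pattern are illustrative assumptions, not your actual setup). In `filebeat.yml`, tag the custom input:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # hypothetical path to your custom logs
    tags: ["myapp"]            # tag used to route events in Logstash
```

Then, in the Logstash pipeline, a conditional keeps the custom filter from touching your syslog events:

```
filter {
  if "myapp" in [tags] {
    # parse only the events tagged by the custom Filebeat input
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }  # example pattern
    }
  }
}
```

With this layout you can still split the config into multiple files (e.g. one filter file per tag) inside the same pipeline; Logstash concatenates all files in a pipeline's config directory, and the conditionals keep the filters from interfering with each other.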
You could also use multiple Beats inputs or syslog inputs and implement multiple pipelines as described here.
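A minimal sketch of the multiple-pipelines option in `pipelines.yml` (the pipeline IDs and config paths are illustrative assumptions):

```yaml
# /etc/logstash/pipelines.yml
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"   # your existing syslog pipeline
- pipeline.id: myapp
  path.config: "/etc/logstash/conf.d/myapp.conf"    # hypothetical custom-log pipeline
```

Each pipeline would then define its own `beats` input on a distinct port (e.g. 5044 and 5045), and Filebeat would point each input's output at the matching port, so the two streams never share filters.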
Thank you, gonna give it a try
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.