Hi - we have a situation where different log files are being sent to different Logstash outputs, and it's getting out of control, with more than 7-8 Filebeat instances running at the moment on a number of our key servers. I am trying to consolidate all of these Filebeat instances into one, if possible.
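To give an idea of what I mean, here is a rough sketch of the consolidated Filebeat I have in mind - a single instance reading all of the log paths, with each input tagged via `fields` so the streams can still be told apart downstream (the paths and the `app` values are just examples):

```yaml
filebeat.inputs:
  # One input per log source; the "app" field is just an example tag
  - type: log
    paths:
      - /var/log/abc/*.log
    fields:
      app: abc
    fields_under_root: true

  - type: log
    paths:
      - /var/log/xyz/*.log
    fields:
      app: xyz
    fields_under_root: true

# Everything ships to a single Logstash endpoint, which would then do the routing
output.logstash:
  hosts: ["logstash-distributor:5044"]
```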
One idea was to have a master Logstash that would look for a specific string in the source message and then route the event to the appropriate downstream Logstash pipeline. Alternatively, is there a way to send output to multiple destinations from the Filebeat process itself - say, if the message contains "abc", send it to the abc Logstash pipeline, and so on? I did see that this can be done through ingest pipelines, but that would mean skipping Logstash, and we don't want to put excessive load on the ingest nodes given their current sizing.
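To make the first idea concrete, this is roughly what I was picturing for the "master" Logstash, assuming the downstream pipelines could run inside the same Logstash instance using pipeline-to-pipeline communication (the pipeline IDs, the port, and the "abc" string below are placeholders):

```yaml
# pipelines.yml sketch: a distributor pipeline inspects the message
# and hands events off to downstream pipelines in the same instance
- pipeline.id: distributor
  config.string: |
    input { beats { port => 5044 } }
    output {
      if "abc" in [message] {
        pipeline { send_to => ["abc"] }
      } else {
        pipeline { send_to => ["catch-all"] }
      }
    }

- pipeline.id: abc
  config.string: |
    input { pipeline { address => "abc" } }
    # filters and the real output for the abc stream would go here
    output { stdout { codec => rubydebug } }

- pipeline.id: catch-all
  config.string: |
    input { pipeline { address => "catch-all" } }
    output { stdout { codec => rubydebug } }
```

If the downstream pipelines were separate Logstash instances instead, I assume the distributor's output section would point at them over the network rather than at local pipeline addresses.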
How are others dealing with this kind of situation? Any ideas would help us make the right decision.
Thanks,
Veeresh