Here is our logs use case as an example. We collect system logs, application logs, and business logs for each application in production, and we want to ship them to different Logstash clusters rather than a single Logstash cluster.
In other words, we want Filebeat to send each type of log to a different Logstash cluster, but by default Filebeat can only send data to one destination.
We don't want everything to go through one dedicated Logstash cluster.
Is there a way to make a single Filebeat send data to multiple Logstash clusters, or alternatively, how can we run multiple Filebeat instances on one server, one per type of log?
I would appreciate it if the Elastic team could provide a detailed solution, ideally with step-by-step instructions on how to implement it.
I know NXLog supports sending data to multiple destinations, but I really want to ship logs with Filebeat.
One Filebeat instance is only able to send to one output. If you would like to forward events to multiple Logstash or Elasticsearch clusters, you need to start multiple Filebeat instances, each with its own configuration.
Step by step getting started guide for Filebeat: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-getting-started.html
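A minimal sketch of the multiple-instance approach, assuming a recent Filebeat version (6.x+ input syntax); the file names, log paths, hostnames, and ports below are placeholders for illustration, not values from this thread. Each instance gets its own config file pointing at a different Logstash cluster:

```yaml
# /etc/filebeat/filebeat-app.yml (placeholder path) - ships application logs
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log                 # placeholder: application log files
output.logstash:
  hosts: ["logstash-app-1:5044", "logstash-app-2:5044"]   # placeholder: app Logstash cluster
```

```yaml
# /etc/filebeat/filebeat-sys.yml (placeholder path) - ships system logs
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog                      # placeholder: system log files
output.logstash:
  hosts: ["logstash-sys-1:5044"]             # placeholder: system Logstash cluster
```

```sh
# Run each instance with its own data (registry) and log directories so the
# two processes do not conflict with each other.
filebeat -c /etc/filebeat/filebeat-app.yml \
  --path.data /var/lib/filebeat-app --path.logs /var/log/filebeat-app &

filebeat -c /etc/filebeat/filebeat-sys.yml \
  --path.data /var/lib/filebeat-sys --path.logs /var/log/filebeat-sys &
```

The separate `--path.data` directories matter because each Filebeat instance keeps a registry of read offsets there; sharing one would cause the instances to interfere with each other. In production you would typically wrap each invocation in its own service unit rather than backgrounding it by hand.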