We have a requirement to send logs from the database using Filebeat to either an Elasticsearch cluster or a Kafka cluster, depending on the log type.
For example: if the log type is INFO, we need to send it to Elasticsearch; if it is ERROR, we need to send it to the Kafka cluster for further processing.
So we want to send data from Filebeat to multiple outputs. Can we do this without using Logstash?
Please let us know what options we have.
Unfortunately, running multiple outputs in a single Filebeat instance is not supported.
However, you could run multiple Filebeat instances reading the same files. For example, one Filebeat instance could read the files, drop every non-INFO log line, and forward the collected events to Elasticsearch. The other instance could keep only the ERROR lines and forward them to Kafka.
Filebeat 1 sending INFO to Elasticsearch:
```yaml
- type: log
```
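Expanding that input stub into a fuller sketch for the first instance — the log paths, Elasticsearch host, and the `drop_event` condition are assumptions, so adjust them to your environment and log format:

```yaml
filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/*.log        # assumed log location
  processors:
    # Drop every event whose message does not contain "INFO",
    # so only INFO-level lines reach the output.
    - drop_event:
        when:
          not:
            contains:
              message: "INFO"

output.elasticsearch:
  hosts: ["localhost:9200"]       # assumed Elasticsearch address
```

If your log lines have a fixed layout, a `regexp` condition instead of `contains` would match the level more precisely.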
Filebeat 2 sending ERRORs to Kafka:
```yaml
- type: log
```
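And a corresponding sketch for the second instance, this time keeping only ERROR lines and shipping them to Kafka — again, the paths, broker addresses, and topic name are assumptions:

```yaml
filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/*.log        # assumed log location (same files as instance 1)
  processors:
    # Drop everything that is not an ERROR-level line.
    - drop_event:
        when:
          not:
            contains:
              message: "ERROR"

output.kafka:
  hosts: ["kafka1:9092"]          # assumed Kafka broker
  topic: "error-logs"             # assumed topic name
```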
Make sure you configure different data and log paths (`path.data` and `path.logs`) when running two Filebeat instances in parallel, so they do not share the same registry.