Configure Filebeat for multiple outputs

Hi Team,

We have a requirement to ship logs from the database host using Filebeat to both an Elasticsearch cluster and a Kafka cluster, depending on the type of log.

For example: if the log level is INFO we need to send it to Elasticsearch, and if it is ERROR we need to send it to the Kafka cluster for further processing.

Can we send data from Filebeat to multiple outputs without using Logstash?

Please let us know what options we have.


Unfortunately, running multiple outputs in Filebeat is not supported.

However, you could run multiple instances of Filebeat reading the same files. For example, one instance could read the files, drop every line that is not an INFO-level log, and forward the collected events to Elasticsearch. The other instance could collect only ERROR-level lines and forward them to Kafka.

Example configurations:

Filebeat 1 sending INFO to Elasticsearch:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
  # include_lines takes a list of regular expressions, not glob patterns
  include_lines: ['INFO']
output.elasticsearch:
  hosts: ["your-es:9200"]

Filebeat 2 sending ERRORs to Kafka:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
  include_lines: ['ERROR']
output.kafka:
  hosts: ["your-kafka:9092"]
  # the Kafka output requires a topic; "error-logs" is just an example name
  topic: "error-logs"

Make sure you configure different data and log paths (`path.data` and `path.logs`) when running two Filebeat instances in parallel; otherwise they will try to use the same registry directory.
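For example, the second (Kafka) instance could override the paths in its own config file like this (the directory names below are just examples, not Filebeat defaults):

```yaml
# Keep this instance's registry and logs separate from the first instance
path.data: /var/lib/filebeat-kafka
path.logs: /var/log/filebeat-kafka
```

Alternatively, you can pass these as overrides on the command line, e.g. `filebeat -c filebeat-kafka.yml -E path.data=/var/lib/filebeat-kafka`.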


