Send logs from Filebeat to 2 different Logstash servers simultaneously

Hi Team,

My requirement is to configure the Filebeat output with two Logstash servers and send logs to both servers simultaneously. It is not a load-balancing scenario; both Logstash servers should receive the same data. Could you please help me?

Filebeat output conf:

#-------------------------- Kafka Output ----------------------------------
output.kafka:
  # initial brokers for reading cluster metadata
  hosts: ["130.99.212.103:9092", "61.85.172.183:9092"]

  # message topic selection + partitioning
  topic: 'testlog'
  partition.round_robin:
    reachable_only: false

  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000

Error log from 2nd logstash:

[2019-12-19T23:03:55,292][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

Note: I am getting logs from Filebeat to the 1st Logstash.

Can anyone help me, please?

Your requirement can be fulfilled by configuring two different output.logstash sections, each with a different Logstash host, as described in the documentation: https://www.elastic.co/guide/en/beats/filebeat/current/logstash-output.html
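For reference, a single output.logstash section from that documentation page looks like the sketch below (the host address is a placeholder, not from your setup):

```yaml
# Minimal Logstash output sketch, per the Filebeat docs linked above.
# Replace the placeholder host with your actual Logstash server address.
output.logstash:
  hosts: ["logstash-host-1:5044"]
```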

Your output, however, shows a sample Kafka output configuration.
If you are using Kafka as a message broker between Filebeat and your Logstash indexers, the logical step is to configure Kafka topics and consumer groups so that each message can be consumed independently by the different Logstash indexers. In that case, you should ask in a Kafka support channel.
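As a sketch of that consumer-group approach: if each Logstash pipeline uses the kafka input with its own distinct group_id, Kafka will deliver every message on the topic to each group, so both indexers receive the same data. The brokers and topic below are taken from your Filebeat config; the group names and the stdout output are illustrative placeholders:

```
# Pipeline for the 1st Logstash indexer (logstash-1.conf, hypothetical name).
# The 2nd indexer would use an identical pipeline with a different group_id,
# e.g. "logstash-indexer-2", so it consumes the full topic independently.
input {
  kafka {
    bootstrap_servers => "130.99.212.103:9092,61.85.172.183:9092"
    topics            => ["testlog"]
    group_id          => "logstash-indexer-1"
  }
}
output {
  stdout { codec => rubydebug }  # replace with your real output, e.g. elasticsearch
}
```

Because the two pipelines belong to different consumer groups, this duplicates the stream rather than load-balancing it, which matches your requirement.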
