Filebeat output condition for multiple (ELK) Logstash servers

(Arun Prasath) #1

There are two environments, DEV/TEST (ENV1) and PROD (ENV2), hosted in two separate cloud environments. Neither has internet connectivity. Each environment has its own ELK setup, one for non-prod and one for prod.

There is another Linux server, running filebeat-5.2.2-1.x86_64 on Red Hat 7.2, which holds sets of log files specific to the non-prod and prod environments. The log files are named like nonprod-ddmmyy.log and prod-ddmmyy.log.

I have the below configuration in filebeat.yml (the conditional hosts lines in the output section are my attempt at routing by tag):

    filebeat.prospectors:
    - input_type: log
      paths:
        - /var/log/custom/log/nonprod-*
      tags: ["nonprod-env"]
      document_type: custom_service1
    - input_type: log
      paths:
        - /var/log/custom/log/prod-*
      tags: ["prd-env"]
      document_type: custom_service2

    output.logstash:
      pretty: true
      if "prd-env" in [tags] {
        hosts: [""]
      }
      if "nonprod-env" in [tags] {
        hosts: [""]
      }

    logging.level: warning

The above configuration is not working. If the output section has only a single hosts line, like below, it works:

    hosts: [""]

Kindly suggest whether it is possible to output to multiple servers, and if so, what the correct if/then/else syntax would be to achieve this. I hope there is some way, and that the experts here can suggest the best possible solution.

(Pier-Hugues Pellerin) #2


If I understand correctly, the servers are responsible for running both the PRODUCTION and the NONPROD applications, and you want to use one Filebeat configuration to route events to two different, independent clusters depending on the configured tags?

The Logstash output is not able to do that: it does not support conditionals for dynamic routing based on event data.

Here are a few solutions I have seen in the past for this kind of problem:

  1. Run two Filebeat instances with independent configurations, each pointing to the right hosts.
  2. Send all the data to Logstash, which buffers the events into backend queues (like Kafka); these queues are then read by the non-prod or prod Logstash.
  3. Use the Kafka output with the topic: '%{[type]}' option to choose a topic dynamically based on the event data, and configure each Logstash to read the right topics.

Could any of these solutions work for you?
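For what it's worth, option 3 might look like the following in filebeat.yml. This is only a sketch: it assumes the Filebeat 5.x Kafka output, and the broker host name is a placeholder, not something from the thread.

```yaml
output.kafka:
  # Placeholder broker address; replace with your own cluster.
  hosts: ["kafka-broker:9092"]
  # Route each event to a topic named after its document_type,
  # e.g. custom_service1 or custom_service2 from the inputs above.
  topic: '%{[type]}'
```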


(Steffen Siering) #3

This ticket adds some context:

When doing event routing or replication in Filebeat, any output can create back-pressure, potentially blocking Filebeat from processing any logs. This creates an indirect coupling between the production and non-production environments: e.g. if non-production is down or slow, the production environment would be affected in the very same way. Using two Filebeat instances, or Logstash/Kafka to decouple the systems, is strongly recommended.
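The Logstash-side routing from option 2 could be sketched like this on the buffering Logstash (assuming the Logstash Kafka output plugin; the broker address and topic names are placeholders, not from the thread):

```
output {
  if "prd-env" in [tags] {
    kafka {
      bootstrap_servers => "kafka-broker:9092"   # placeholder broker
      topic_id => "prod-logs"                    # placeholder topic
    }
  } else if "nonprod-env" in [tags] {
    kafka {
      bootstrap_servers => "kafka-broker:9092"
      topic_id => "nonprod-logs"
    }
  }
}
```

The prod and non-prod Logstash instances would then each consume only their own topic, so a slow non-prod consumer cannot block prod.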

(Arun Prasath) #4

My cloud design consists of two completely independent clouds for non-prod and prod. We use only ELK; unfortunately, no Kafka is in use. Filebeat sits outside these two cloud networks and pushes the environment-specific log files into the respective cloud platforms.

What if I push all the logs to both ELK setups (both clouds) using the two hosts lines below, will that work? I would prefer pushing the events to two ELK Logstash servers if possible, and only as a last option run another Filebeat instance. Is this something we can expect in future releases?

    hosts: [""]
    hosts: [""]

(Steffen Siering) #5

For your use case you will need two separate Filebeat instances. I can't tell whether we will support this use case anytime in the future, but it's definitely something we will discuss.

(Arun Prasath) #6

@pierhugues - For running two instances of Filebeat, which of the following is the correct approach?

  1. Filebeat is already installed in the default path, using its default /var/lib/filebeat/registry file. Is it possible to run a second Filebeat from the same installation with a different registry and a different filebeat.yml file?
  2. Or do we have to install and configure another Filebeat and run that?
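(The thread closed without an answer, but option 1 is generally possible with Beats: the same binary can be started a second time with its own config file and data directory so the two registries do not collide. A sketch, assuming the Filebeat 5.x RPM layout and command-line flags; the file paths are examples only:)

```shell
# First instance keeps the default config and registry.
# Second instance: point it at its own config file and its own
# data directory, so it maintains a separate registry file.
/usr/share/filebeat/bin/filebeat \
  -c /etc/filebeat/filebeat-prod.yml \
  -path.data /var/lib/filebeat-prod
```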

(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.