Filebeat output to multiple Elasticsearch clusters


(Alain) #1

Hi,
I have a server RHEL7 with filebeat client installed. I want to send the syslog to a cluster of ElasticSearch (elk01) and the logs of Nginx to another one (elk02). What's the config for that. The first one wotk great. But I don't know how to configure the module for Nginx.


```yaml
- module: nginx
  setup.kibana.host: ["elk02:5601"]
  output.elasticsearch.hosts: ["elk02:2900"]

  # Access logs
  access:
    enabled: true
    var.paths: ["/var/log/nginx/*_access.log"]

  # Error logs
  error:
    enabled: true
    var.paths: ["/var/log/nginx/*_error.log"]
```


Thanks !


(Ryne Keel) #2

It sounds like you have two different clusters? If so, then I would look for a post about running multiple Filebeat instances from the same directory. Basically, you start Filebeat twice and point each instance at a different config file.

So you would have something like syslog.yml and nginx.yml inside the Filebeat directory. In the modules .yml files, specify only the access and error log sections like you already did, and move the setup.kibana.host and output.elasticsearch.hosts settings into the per-instance configuration files in the Filebeat directory (those settings are top-level options, not module options, which is why they don't work inside the module file).
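A minimal sketch of what the Nginx-dedicated config (nginx.yml) might look like, assuming the host names, the port 2900 from the thread, and the file layout above (the exact paths are assumptions, adapt them to your setup):

```yaml
# nginx.yml — hypothetical per-instance Filebeat config (sketch, not verified)
filebeat.modules:
  - module: nginx
    access:
      enabled: true
      var.paths: ["/var/log/nginx/*_access.log"]
    error:
      enabled: true
      var.paths: ["/var/log/nginx/*_error.log"]

# These are top-level settings, outside the module block:
setup.kibana:
  host: "elk02:5601"

output.elasticsearch:
  hosts: ["elk02:2900"]
```

You would then start the two instances something like this; each instance needs its own data path so their registries don't collide:

```shell
filebeat -e -c /etc/filebeat/syslog.yml --path.data /var/lib/filebeat-syslog
filebeat -e -c /etc/filebeat/nginx.yml  --path.data /var/lib/filebeat-nginx
```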