Filebeat output to multiple Elasticsearch clusters

Hi,
I have an RHEL 7 server with the Filebeat client installed. I want to send syslog to one Elasticsearch cluster (elk01) and the Nginx logs to another one (elk02). What is the configuration for that? The syslog part works great, but I don't know how to configure the module for Nginx.


```yaml
- module: nginx
  setup.kibana.host: ["elk02:5601"]
  output.elasticsearch.hosts: ["elk02:2900"]

  # Access logs
  access:
    enabled: true
    var.paths: ["/var/log/nginx/*_access.log"]

  # Error logs
  error:
    enabled: true
    var.paths: ["/var/log/nginx/*_error.log"]
```


Thanks !

It sounds like you have two different clusters? If so, I would look for a post about running multiple Filebeat instances from the same directory. A single Filebeat instance only supports one output, so the approach is basically to start Filebeat twice and point each instance at a different config file.

So you would have something like syslog.yml and nginx.yml inside the Filebeat directory. In the modules' .yml files you only specify the access and error log sections, as you already did, and you put the setup.kibana.host and output.elasticsearch.hosts settings inside the configuration files in the Filebeat directory instead.
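If it helps, here is a rough sketch of what the second config file could look like. The file name, the /etc/filebeat and /var/lib/filebeat-nginx paths, and the startup command in the comments are assumptions for illustration, not something from your setup; the elk02 host, port, and log paths are just copied from your post, so adjust everything to your environment.

```yaml
# nginx.yml -- hypothetical second Filebeat config that ships only Nginx logs to elk02.
# Started as a separate instance, for example:
#   filebeat -c nginx.yml --path.config /etc/filebeat --path.data /var/lib/filebeat-nginx
# (the separate --path.data is an assumption to keep this instance's registry
#  apart from the syslog instance's registry)

filebeat.modules:
  - module: nginx
    # Access logs
    access:
      enabled: true
      var.paths: ["/var/log/nginx/*_access.log"]
    # Error logs
    error:
      enabled: true
      var.paths: ["/var/log/nginx/*_error.log"]

# Kibana host for dashboard setup (single string, not a list)
setup.kibana:
  host: "elk02:5601"

# This instance's only output: the elk02 cluster
output.elasticsearch:
  hosts: ["elk02:2900"]
```

Your existing syslog configuration would stay in its own file (e.g. syslog.yml) pointing at elk01, started the same way with its own data path, so the two instances don't overwrite each other's state.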
