How to properly use two Filebeat modules listening on the same port?

I currently have the Fortinet and Cisco modules enabled on the same Filebeat instance, with a Cisco Meraki network device and a Fortinet firewall both sending syslog to the same port, 5514. I am using Docker with an Elasticsearch, Kibana, and Filebeat stack, with Filebeat shipping the logs directly to Elasticsearch. Is there any way to configure Filebeat so that it knows which module to use to parse the data coming in, whether from the Fortinet firewall or the Cisco Meraki?


    path: ${path.config}/modules.d/*.yml
    reload.enabled: true

  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_tags:
#      when:
#        equals:
#         source.ip: REDACTED
      tags: [fortinet]
      target: "event.module"

  hosts: ['']
#  indices:
#    - index: "filebeat-syslog-fortinet-%{+YYYY.MM.dd}"
# "fortinet"
#      setup.template.pattern: "fortinet-%{[agent.version]}"
#      when.equals:
#        event.module: "fortinet"
#    - index:  "filebeat-syslog-cisco-%{+YYYY.MM.dd}"
# "cisco"
#      setup.template.pattern: "cisco-%{[agent.version]}"
#      when.equals:
#        event.module: "cisco"

#- type: syslog
#  format: auto
#  protocol.udp:
#    host: ""


# Module: fortinet
# Docs:

- module: fortinet
  firewall:
    enabled: true

    # Set which input to use between tcp, udp (default) or file.
    var.input: udp

    # The interface to listen to syslog traffic. Defaults to
    # localhost. Set to 0.0.0.0 to bind to all available interfaces.

    # The port to listen for syslog traffic. Defaults to 9004.
    var.syslog_port: 5514

    # Set internal interfaces, used to override the parsed network.direction
    # based on a tagged interface. Both internal and external interfaces must be
    # set to leverage this functionality.
    #var.internal_interfaces: [ "LAN" ]

    # Set external interfaces, used to override the parsed network.direction
    # based on a tagged interface. Both internal and external interfaces must be
    # set to leverage this functionality.
    #var.external_interfaces: [ "WAN" ]


# Module: cisco
# Docs:

- module: cisco
...redacted for brevity
    enabled: true

    # Set which input to use between udp (default), tcp or file.
    var.input: udp
    var.syslog_port: 5514

    # Set paths for the log files when file input is used.
    # var.paths:

    # Toggle output of non-ECS fields (default true).
    # var.rsa_fields: true

    # Set custom timezone offset.
    # "local" (default) for system timezone.
    # "+02:00" for GMT+02:00
    # var.tz_offset: local

You can see in the filebeat.yml above some potential solutions I have tried, to no avail. For instance, I tried to force the fortinet module by overwriting the "event.module" field on every event, just to see if I could. It doesn't look like that field is mutable, though, and it seems that Filebeat simply uses the first module configured on the port to parse every event arriving on it.

I believe the question linked here asks the same or a very similar thing, but it was never resolved, and both the asker and the answerer were a bit unclear. Any help would be appreciated, if this is possible at all. Also, if there is a more efficient way to achieve something like this using Logstash or similar, please enlighten me, as I am new to ELK. Thanks 🙂

Have you considered using Logstash to do the parsing instead?

Hi Sunile,

Thanks for your response. I guess my questions are: is this possible to do in Filebeat at all? And if Logstash does the parsing, are you implying it would still be a good idea to keep Filebeat in my stack? If so, before or after Logstash, and what would Filebeat add to that process?

First, I recommend using Elastic Agent over Filebeat and letting Fleet manage it. Elastic Agent/Filebeat adds metadata to the payload, which I've found useful in practice.

Sending events directly over syslog to Logstash is definitely within the range of common ingestion patterns.

Each module in Filebeat will spin up its own listener on the configured port, so it is not possible to use the same port for both modules.
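The straightforward workaround inside Filebeat is to give each module its own port. A sketch, assuming the fortinet module's firewall fileset and the cisco module's ios fileset (the filesets and the second port number are assumptions; adjust them to your devices):

    # modules.d/fortinet.yml
    - module: fortinet
      firewall:
        enabled: true
        var.input: udp
        var.syslog_host: 0.0.0.0
        var.syslog_port: 5514

    # modules.d/cisco.yml -- point the Meraki device at a second port
    - module: cisco
      ios:
        enabled: true
        var.input: udp
        var.syslog_host: 0.0.0.0
        var.syslog_port: 5515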

To use a single shared port you would need Logstash: filter the messages based on some distinguishing content and direct them to different outputs.
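For example, a minimal Logstash pipeline along those lines (a sketch, not a tested pipeline; the devname= marker, the index names, and the elasticsearch host are assumptions you would need to adapt):

    input {
      udp {
        port => 5514
        type => "syslog"
      }
    }

    filter {
      # FortiGate syslog payloads are key=value pairs and normally contain
      # devname=..., while Meraki messages are classic syslog text. Match on
      # whatever reliably distinguishes your two sources (e.g. the sender IP
      # in [host] instead of the message body).
      if [message] =~ /devname=/ {
        mutate { add_tag => ["fortinet"] }
      } else {
        mutate { add_tag => ["cisco"] }
      }
    }

    output {
      if "fortinet" in [tags] {
        elasticsearch {
          hosts => ["http://elasticsearch:9200"]
          index => "filebeat-syslog-fortinet-%{+YYYY.MM.dd}"
        }
      } else {
        elasticsearch {
          hosts => ["http://elasticsearch:9200"]
          index => "filebeat-syslog-cisco-%{+YYYY.MM.dd}"
        }
      }
    }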

Thanks Leandro, I figured as much.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.