Duplicate messages created by FileBeat

I'm using Filebeat to ship log messages into Elasticsearch through Logstash. The log files are located on a Windows network share, and Filebeat runs on a Windows machine. The problem is that some log file records are duplicated. What is interesting is that the duplicate records are created at moments separated in time by even several hours. In the Filebeat and Logstash logs I don't see anything critical. I know that reading logs from a network share with Filebeat is not recommended. What could be the reason for the duplicate messages created by Filebeat?
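In case it helps frame the question: since the events go through Logstash anyway, one common way to make re-delivered events harmless on the Elasticsearch side is to compute a deterministic document ID with the fingerprint filter, so a duplicate overwrites the existing document instead of being indexed twice. This is only a sketch, assuming a beats input on port 5044 and output straight to Elasticsearch; the hosts and index name are placeholders:

    input {
      beats {
        port => 5044
      }
    }

    filter {
      # Derive a stable hash from the raw message so the same log line
      # always maps to the same document ID.
      fingerprint {
        source => "message"
        target => "[@metadata][fingerprint]"
        method => "SHA256"
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]        # placeholder
        index => "logs-%{+YYYY.MM.dd}"     # placeholder
        # A re-sent event updates the existing document instead of duplicating it.
        document_id => "%{[@metadata][fingerprint]}"
      }
    }

This doesn't explain why the duplicates appear, but it keeps them out of the index while the cause is being investigated.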

Here is my filebeat.yml:

    filebeat.inputs:
    - type: log
      paths:
        - \\<host_name>\<folder>\*.<service_name>.*.2019*.log
        - \\<host_name>\<folder>\*.<service_name>.*.2019*.log...
      encoding: Windows-1251
      multiline.pattern: '^\d{4}-\d{2}-\d{2}'
      multiline.negate: true
      multiline.match: after

    filebeat.config.modules:
      path: ${path.config}/modules.d/*.yml
      reload.enabled: false

    setup.template.settings:
      index.number_of_shards: 3

    output.logstash:
      hosts: ["<some_ip_address>:5044"]
      enabled: true

    processors:
    - add_host_metadata: ~
    - add_cloud_metadata: ~
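For what it's worth, the log input also has options that control how long Filebeat keeps files open and how long it remembers them in its registry. On a network share these can matter, because a share that briefly disappears can make Filebeat drop its stored read offsets and re-read files from the beginning. A sketch of the relevant settings, with illustrative values rather than a recommendation:

    filebeat.inputs:
    - type: log
      paths:
        - \\<host_name>\<folder>\*.<service_name>.*.2019*.log
      # Close the file handler after this period without new data (default 5m).
      close_inactive: 5m
      # Skip files that have not been modified within this window.
      ignore_older: 48h
      # Drop registry state for files untouched for this long;
      # must be greater than ignore_older + scan_frequency.
      clean_inactive: 72h
      # Keep state even when a file is temporarily unreachable, so a short
      # outage of the share does not cause files to be read again from scratch.
      clean_removed: false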
