Auditbeat file_integrity module: scan rate setting

My goal is to get the exact number of log files in the path that Auditbeat monitors into Elasticsearch, so that I can build a per-day chart of incoming logs in Kibana.
The problem is that I get several duplicates of the same log file in Elasticsearch: Auditbeat detects intermediate changes to the file and sends several events for it.
Each multiline log file consists of a request and its response. The response arrives within a fraction of a second, and once it is written the file no longer changes:

2021-12-09 16:52:24.6435|INFO|logfile|Request:
2021-12-09 16:52:24.6747|INFO|logfile|Response:

How can I adjust the file scanning frequency to avoid these duplicates?
Or is there a way to use the file's generated hash as the document id in Elasticsearch?
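One possible approach to the second question, sketched under assumptions: since the output goes through Logstash, the fingerprint filter can derive a stable id from the event and the elasticsearch output can use it as `document_id`, so repeated events for the same file content overwrite one another instead of creating duplicates. The field names (`[file][path]`, `[file][hash][sha256]`) and the Elasticsearch host/port below are assumptions about the pipeline, not taken from the original post:

```
filter {
  fingerprint {
    # Assumed field layout: combine the file path and its content
    # hash into one stable fingerprint per file version.
    source => ["[file][path]", "[file][hash][sha256]"]
    concatenate_sources => true
    method => "SHA256"
    target => "[@metadata][fingerprint]"
  }
}

output {
  elasticsearch {
    hosts => ["http://10.1.1.4:9200"]   # hypothetical ES endpoint
    # Reuse the fingerprint as the document id; duplicate events
    # become updates of the same document rather than new documents.
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

With this, a count of documents per day in Kibana should approximate the number of distinct file versions rather than the number of raw events.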

My auditbeat config:

auditbeat.modules:
- module: file_integrity
  paths:
  - C:/xml_logs/GT/xml/
  scan_at_start: true
  recursive: true
setup.template.settings:
  index.number_of_shards: 1

tags: ["xml"]

output.logstash:
  hosts: ["10.1.1.4:5044"]

logging:
  to_files: true
  files:
    path: C:/ProgramData/auditbeat/Logs
  level: debug
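For reference, the file_integrity module also exposes `scan_rate_per_sec` and `hash_types`. A hedged sketch of how the module section could look with them; note that `scan_rate_per_sec` only throttles the initial scan at startup, so by itself it will not suppress duplicates caused by real-time change events:

```
auditbeat.modules:
- module: file_integrity
  paths:
  - C:/xml_logs/GT/xml/
  scan_at_start: true
  recursive: true
  # Throttles only the startup scan, not real-time change detection.
  scan_rate_per_sec: 50 MiB
  # Compute a single hash type to keep events small; sha256 is an
  # assumption, any supported type works.
  hash_types: [sha256]
```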
