Hi, when I use Filebeat to collect log data, I found that the output file contains many duplicates of each log entry. The log data is generated by a Python script: every log entry has a unique number, and every 300 entries trigger a log rotation. First I start Filebeat, then run the script to generate the log. In the end, entries 1-300 occur once, entries 301-600 occur twice, entries 601-900 occur three times, and so on.
Here is my test data:
python script: https://github.com/shanmaoye/filebeat-test/blob/master/logrotate_test.py
filebeat debug level log: https://raw.githubusercontent.com/shanmaoye/filebeat-test/master/beat.log
the output of the filebeat: https://raw.githubusercontent.com/shanmaoye/filebeat-test/master/filebeat
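In case you don't want to follow the script link, here is a simplified sketch of what the generator does (not the exact script; the entry format, file names, and rename-based rotation are assumptions for illustration):

```python
import os


def generate(log_path, total=900, rotate_every=300):
    """Write numbered entries to log_path, rotating every `rotate_every` entries.

    Rotation renames log_path to log_path + ".1" and reopens a fresh file,
    mimicking a simple logrotate "create"-style rotation.
    """
    f = open(log_path, "w")
    for i in range(1, total + 1):
        f.write("entry %d\n" % i)
        f.flush()
        if i % rotate_every == 0:
            f.close()
            os.replace(log_path, log_path + ".1")  # rotate: .log -> .log.1
            f = open(log_path, "w")
    f.close()
```

After a run with the defaults, `shanmao.log.1` holds the last 300 entries and `shanmao.log` is a fresh empty file, which is the state Filebeat sees right after each rotation.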
my filebeat config:

filebeat:
  prospectors:
    -
      paths:
        - /var/log/shanmao/shanmao.log
        - /var/log/shanmao/shanmao.log.1
      input_type: log
      exclude_lines: ["^$"]
output:
  file:
    path: "/tmp/filebeat"
    filename: filebeat
    rotate_every_kb: 100000
    number_of_files: 7
shipper:
logging:
  level: debug
  to_files: true
  files:
    path: /var/log/mybeat
    name: beat.log
    rotateeverybytes: 10485760
My system info: Ubuntu 14.04, 64-bit; the Filebeat version is 1.2.3. Why does this happen? Is my configuration wrong? Thanks.