I have a special log rolling strategy. The application keeps writing logs to file f. When file f reaches the rolling threshold, f is renamed to f1, the previous f1 is renamed to f2, and so on. Recently we found a lot of duplicates in the logs collected by Filebeat. We suspect the duplicates are caused by this log rolling strategy, but we have not been able to find the specific reason for them. Does anyone know what causes the duplicate logs?
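For reference, here is a minimal Python sketch of the rolling strategy described above. The file names f, f1, f2 come from the description; the retention count and threshold are made-up values, not our real settings:

import os

LOG = "f"                      # active log file (name from the description above)
MAX_FILES = 5                  # hypothetical retention count
THRESHOLD = 10 * 1024 * 1024   # hypothetical rolling threshold in bytes

def roll():
    # Shift the archived files up one slot: f4 -> f5, ..., f1 -> f2.
    for i in range(MAX_FILES - 1, 0, -1):
        src = f"{LOG}{i}"
        if os.path.exists(src):
            os.rename(src, f"{LOG}{i + 1}")
    # Finally move the active file aside; the writer reopens f afterwards.
    os.rename(LOG, f"{LOG}1")

def maybe_roll():
    # Called after each write; rolls once the file reaches the threshold.
    if os.path.exists(LOG) and os.path.getsize(LOG) >= THRESHOLD:
        roll()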
Could you please explain where you see the duplicates? Did you check their size and content too ("ls -la")?
Sorry for not responding in time. My program writes 5 entries to a file and then performs a log roll. What I have seen so far is: when Filebeat collects data repeatedly, it always re-collects all of the data in the file, not just a few entries. A reproduction sketch follows below.
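In case it helps, this is roughly how I reproduce it (using a simplified one-step roll; the entry format is made up):

import os
import time

def write_batch(n=5):
    # Append n entries to the active log file "f" from the description above.
    with open("f", "a") as fh:
        for i in range(n):
            fh.write(f"entry {i} at {time.time()}\n")

write_batch()           # write 5 entries
os.replace("f", "f1")   # simplified one-step roll: rename f to f1
write_batch()           # Filebeat then re-collects the whole file, not just the 5 new entries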
After several tests, I found that the file's inode is being reused. But after further analysis, I found that the inode reuse is not what causes Filebeat to collect the logs repeatedly (the check I used is sketched after the configuration below). In addition, my configuration is quite basic: it uses a file output, so the logs collected by Filebeat are written to another file. I analyzed that file and found that Filebeat has indeed collected duplicates.
filebeat.inputs:
- type: log
  paths:
    - /home/xxx/log/*.log

output.file:
  path: "/home/xiongjunkun/filebeat"
  filename: filebeat
  rotate_every_kb: 1073741824
  number_of_files: 16
  permissions: 0600

logging.level: debug
logging.selectors: []
logging.to_stderr: false
logging.to_syslog: false
logging.to_eventlog: false
logging.to_files: true
logging.files:
  path: /home/xxx/filebeat_log/
  name: filebeat_log.log
  rotateonstartup: true
  rotateeverybytes: 104857600
  keepfiles: 64
  permissions: 0600
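For completeness, this is a sketch of how I checked for inode reuse; the path is illustrative and matches the *.log glob in the input above:

import os
import time

PATH = "/home/xxx/log/app.log"  # illustrative path matching the input glob

history = []  # inodes observed for PATH, in order of first appearance
while True:
    try:
        ino = os.stat(PATH).st_ino
    except FileNotFoundError:
        time.sleep(0.5)  # the file briefly disappears during a roll
        continue
    if not history or history[-1] != ino:
        if ino in history:
            print(f"inode {ino} reused for {PATH}")  # an old inode came back
        else:
            print(f"{PATH} now has inode {ino}")
        history.append(ino)
    time.sleep(0.5)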
I encountered the same problem
Has your problem been solved?
Hi, are there any new discoveries?