Hello,
This is my current setup.
- Filebeat running in a Docker container, reading JSON files from a mounted Docker volume.
- These files are only read once and then closed (`close_eof: true`).
- Files are written to the Docker volume in bursts of many files per second.
- Data is sent to a hosted Elastic cluster using the `cloud.id` option.
- I have a script that reads the Filebeat registry and removes each file after it is completely processed.
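For context, the cleanup script works roughly like the sketch below. This is a hedged illustration, not the exact script: it assumes the older (pre-7.0) Filebeat registry layout, a single JSON array at `data/registry` where each entry carries a `source` path and a byte `offset`, and the registry path itself is an assumption. A file is treated as fully processed when the recorded offset has reached the file's size.

```python
# Hypothetical sketch of the registry-driven cleanup script.
# Assumptions (not from the original post): registry is a JSON array
# at REGISTRY_PATH, entries have "source" and "offset" fields.
import json
import os

REGISTRY_PATH = "/usr/share/filebeat/data/registry"  # assumed location


def files_to_delete(entries, size_of):
    """Return source paths whose recorded offset covers the whole file."""
    done = []
    for entry in entries:
        source = entry.get("source")
        if source is None:
            continue
        # Fully harvested: Filebeat's offset reached (or passed) the file size.
        if entry.get("offset", 0) >= size_of(source):
            done.append(source)
    return done


if __name__ == "__main__":
    with open(REGISTRY_PATH) as fh:
        entries = json.load(fh)
    for path in files_to_delete(entries, os.path.getsize):
        os.remove(path)
```

Note that this only checks the harvester offset in the registry; it cannot tell whether the events were actually acknowledged by Elasticsearch, which may matter for the problem described below.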
Problem:
Some files are never sent to Elastic and start accumulating in the Docker volume. After some investigation, this is what's happening:
- Those files get completely harvested by Filebeat, confirmed by looking at the Filebeat log.
- The files are no longer in the Filebeat registry (which should mean harvesting is done and delivery was confirmed?).
- Like I said, the files never show up in Elastic and never get deleted by the custom script.
- Restarting Filebeat fixes the issue: the file is harvested again and then sent.
Any idea what could be happening?
Current Filebeat config:
```yaml
filebeat.inputs:
- type: log
  enabled: true
  close_eof: true
  fields.type: 'stats'
  json.keys_under_root: true
  harvester_limit: 150
  paths:
    - /Logs/stats/*.stats.json
```
This happens only to some files.