Issue in Filebeat when files get rolled over and deleted

Hi all,
We are using Filebeat to push logs to Elasticsearch via Logstash. Under heavy load the log rolls over every half hour, and once the number of files crosses the limit configured in log4j (say, a maximum of 100 rollover files), the older files are deleted by log4j.
However, our Filebeat service keeps those files open even after all their logs have been shipped to Elasticsearch and the files have been deleted from the file system. This is causing a filesystem space issue for us.

Example (lsof output):

```
filebeat 30507 root 274r REG 253,8 52429356 1444109 sample.log
filebeat 30507 root 275r REG 253,8 52450263 1443105 sample1.log (deleted)
filebeat 30507 root 276r REG 253,8 52428841 1445076 sample2.log (deleted)
```

Any updates on this? I am completely blocked by this issue.

The close_older option should be what you're looking for. What's somewhat surprising is that it defaults to 60 minutes, so with a 30 minute rotation you shouldn't see more than two files open at the same time.
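As a rough sketch, lowering `close_older` in the prospector configuration should release handles to rotated files sooner. The paths and the 30-minute value below are placeholders to match the rotation interval described above, not a recommendation:

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/*.log   # placeholder path
      input_type: log
      # Close the file handle if the file has not been read for this long.
      # Defaults to 1h; set closer to the rotation interval so deleted
      # files are released sooner. (Hypothetical value for illustration.)
      close_older: 30m
```

Note that if Filebeat is shut down or falls behind while a file is still being harvested, closing it early can risk losing lines written after the close, so this is a trade-off between disk space and delivery guarantees.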

Any updates on this? I am completely blocked by this issue.

Please do not expect people to answer questions over the weekend.

Which Filebeat version are you using? Can you share your config file?

This topic was automatically closed after 21 days. New replies are no longer allowed.