Hi all,
We are using Filebeat to push logs to Elasticsearch via Logstash. Under heavy load the log rolls over every half hour, and once the number of files exceeds the log4j rollover limit (say, a maximum of 100 files), log4j deletes the older files.
However, our Filebeat service keeps those files open even though all their logs have already been shipped to Elasticsearch and the files have been deleted from the file system. Because the deleted files are still held open, their space is never reclaimed, and this is leading to a file system space issue.
Example (lsof output):
filebeat 30507 root 274r REG 253,8 52429356 1444109 sample.log
filebeat 30507 root 275r REG 253,8 52450263 1443105 sample1.log (deleted)
filebeat 30507 root 276r REG 253,8 52428841 1445076 sample2.log (deleted)
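For reference, this is the kind of input configuration we have been looking at to make the harvester release deleted files sooner. This is only a sketch: the paths are placeholders for our actual log directory, and the close/clean options shown are the Filebeat log-input settings we understand to control this behavior (defaults may vary by Filebeat version):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/logs/sample*.log   # placeholder for our actual log path
    # Close the harvester when the file is removed from disk,
    # so the file handle (and its disk space) is released.
    close_removed: true
    # Force-close a harvester after this duration even if it is
    # still reading, as a safety net under heavy load.
    close_timeout: 30m
    # Also remove the registry state for files deleted from disk.
    clean_removed: true
```

We would appreciate confirmation on whether tuning these options is the right approach here, or whether something else is keeping the handles open.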