We are currently using Logstash with the file input plugin to ship logs from Docker to Elasticsearch. We are running Logstash 7.1 and the latest version of the plugin.
The problem we are running into is that after about 7-8 days, Logstash runs out of available file handles, because the process never releases handles for deleted files. The log files for a container are cleaned up fairly quickly after the container exits, but Logstash keeps the handles open. We currently allow 65536 file handles for the process; after 7-8 days Logstash has consumed all of them, throws a "Too many open files" error, and stops shipping logs.
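For anyone wanting to reproduce the symptom, this is roughly how the leak can be counted on Linux by looking at the process's fd table under /proc (`LOGSTASH_PID` is a placeholder here, not something from our setup):

```shell
# Count file descriptors held open on already-deleted files (Linux /proc).
# Substitute the real PID, e.g. LOGSTASH_PID=$(pgrep -f logstash).
# Run as a user allowed to read the target process's fd table (root for Logstash).
LOGSTASH_PID="${LOGSTASH_PID:-$$}"
ls -l "/proc/${LOGSTASH_PID}/fd" | grep -c '(deleted)'
```

A steadily growing count against a roughly constant number of live log files is what we observe.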
We saw the close_older option and thought it would help, but the default (1 hour) should be more than enough in this case, and it does not seem to be closing the older files.
This is our input file configuration:
input {
  file {
    path => ["/var/lib/docker/containers/*/*-json.log"]
    sincedb_path => "/etc/logstash/docker.db"
    codec => json {
      charset => "UTF-8"
    }
  }
}
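In case it matters, this is the kind of variant we would try next, with close_older set explicitly rather than relying on the default (the "1 hour" value below is just an illustration, not something we have verified fixes the leak):

```
input {
  file {
    path => ["/var/lib/docker/containers/*/*-json.log"]
    sincedb_path => "/etc/logstash/docker.db"
    # Explicitly close files not read from in the last hour
    # (same as the documented default, stated here for clarity).
    close_older => "1 hour"
    codec => json {
      charset => "UTF-8"
    }
  }
}
```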
Any ideas as to how we can resolve this issue, short of restarting Logstash every so often?