Filebeat too many open files error when number of log files bigger than 1024


#1

Filebeat Version : 1.2.3
OS: CentOS 7 (Docker)
Error Msg: 2016-09-01T08:20:48Z ERR Stop Harvesting. Unexpected file opening error: open xxx: too many open files

ulimit -a | grep "open files"
open files (-n) 65536

/etc/security/limits.conf:

* soft nofile 65536
* hard nofile 65536

/etc/sysctl.conf:
fs.file-max = 655360

With 31498 being the pid of the filebeat process:
cat /proc/31498/limits | grep "Max open files"
Max open files 1024 4096 files
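A quick way to confirm which limit a process is actually running under (the `ulimit` output above only reflects the current shell) is to read its `/proc` entry directly; a minimal check, with `$$` standing in for the filebeat pid:

```shell
# Check the limit a running process actually has (not the shell's ulimit),
# and how many file descriptors it currently holds open.
# Substitute the real filebeat pid (e.g. from `pgrep filebeat`); $$ (this
# shell's own pid) is used here only as a stand-in.
PID=$$
grep "Max open files" "/proc/$PID/limits"
ls "/proc/$PID/fd" | wc -l
```

If the soft limit printed here is 1024 while the shell reports 65536, the process was started by something (such as systemd) that does not read the PAM limits configuration.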


(ruflin) #2

Just two days ago we introduced a config option to limit the number of open harvesters (= file handlers): https://github.com/elastic/beats/pull/2417 If you would like to test this you can use the nightly builds here: https://beats-nightlies.s3.amazonaws.com/index.html?prefix=filebeat/
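For anyone trying the nightly: the option is set per prospector in filebeat.yml. A minimal sketch, assuming the option name landed as `harvester_limit` (check the linked PR for the final name and config layout):

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/*.log
      # Assumed option name from PR 2417: cap the number of
      # concurrently open harvesters (and thus open files)
      # for this prospector. 0 means no limit.
      harvester_limit: 512
```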


#3

Thanks for your answer.

But I want to figure out why filebeat can only open 1024 files for harvesting. Is there some way to raise that limit? I changed the system limit settings, but they had no effect on the filebeat process.


#4

Finally I found the solution: in CentOS 7 / RHEL 7, SysV init is replaced by systemd, and as a result modifying /etc/security/limits.conf (which only applies to PAM login sessions) has no effect on systemd services. I modified /usr/lib/systemd/system/filebeat.service to raise the open file limit:

[Service]
LimitNOFILE=100000
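One caveat with editing the unit under /usr/lib directly: a package upgrade can overwrite it. The same limit can be applied through a drop-in file instead, which survives upgrades; a sketch using standard systemd conventions:

```shell
# Create a drop-in that overrides only LimitNOFILE for filebeat.service,
# leaving the packaged unit file untouched.
sudo mkdir -p /etc/systemd/system/filebeat.service.d
sudo tee /etc/systemd/system/filebeat.service.d/limits.conf <<'EOF'
[Service]
LimitNOFILE=100000
EOF

# Reload unit definitions and restart the service so the new limit
# is picked up by the filebeat process.
sudo systemctl daemon-reload
sudo systemctl restart filebeat
```

Afterwards, `cat /proc/$(pgrep filebeat)/limits` should show the raised "Max open files" value.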


(ruflin) #5

@vin Glad you found a solution and thanks for posting it here.


(system) #6

This topic was automatically closed after 21 days. New replies are no longer allowed.