Logstash too many open files error

Hello there,
We have set up ELK on a single VM. Kibana was unable to display the desired logs for a period of 10 days.
According to the error logs, the problem appeared to be with Logstash. The error was: dial udp x.x.x.x:53: socket: too many open files.
We tried increasing the maximum open-file limit for the process but were unable to do so. The limits reported by cat /proc/<pid>/limits were 1024 (soft limit) and 4096 (hard limit).
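For reference, this is roughly how we checked the limits. Note that what matters is the limit of the Logstash process itself, not of the interactive shell; the pgrep pattern "logstash" is an assumption about the process name on our box:

```shell
# Find the Logstash PID (process name pattern is an assumption; adjust to your setup)
LS_PID=$(pgrep -f logstash | head -n 1)

# Per-process limits as the kernel sees them for that PID
grep 'open files' "/proc/${LS_PID}/limits"

# For comparison, the current shell's own soft and hard limits
ulimit -Sn
ulimit -Hn
```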

Even after multiple attempts, we were not able to get the open-file limit for Logstash actually raised, and we could not pin down the reason.

We were able to successfully increase the limit for the Elasticsearch process, but that did not resolve the issue.

So one of my questions is: is it possible to increase the limit at the process level (i.e., for Logstash)? If yes, could you explain how that can be done? If not, are there other measures we can look into?
Or could the cause be something else entirely that we are missing?
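In case it helps to see what we have tried: below is a sketch of the process-level approaches as we understand them. The service name "logstash", the pgrep pattern, and the 65536 value are all assumptions about our setup, not confirmed specifics:

```shell
# Option 1: raise the limit of the already-running process (takes effect
# immediately, but does not survive a restart)
sudo prlimit --pid "$(pgrep -f logstash | head -n 1)" --nofile=65536:65536

# Option 2: make it persistent via a systemd drop-in, assuming Logstash
# runs as a systemd service named "logstash"
sudo mkdir -p /etc/systemd/system/logstash.service.d
sudo tee /etc/systemd/system/logstash.service.d/limits.conf <<'EOF'
[Service]
LimitNOFILE=65536
EOF
sudo systemctl daemon-reload
sudo systemctl restart logstash

# Verify the new limit actually took effect for the process
grep 'open files' "/proc/$(pgrep -f logstash | head -n 1)/limits"
```

Is this the right direction, or is there a Logstash-specific setting we should be using instead?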

Any inputs or suggestions would be appreciated.

Kaushik Nambiar

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.