Too many open files in Elasticsearch

(Constantino Matias Salvador) #1

Hello, I have this problem in Elasticsearch and I can't solve it.

[2016-01-14 10:09:34,789][WARN ][] Failed to accept a connection. Too many open files
at Method)
at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$
at java.util.concurrent.ThreadPoolExecutor.runWorker(

I use this input in configuration of Logstash

input {
  file {
    type => 'apache'
    path => '/etc/httpd/logs/*'
  }
}

I use this because I want to read the Apache access log files; these files rotate each day.
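A first thing worth checking is the per-process open-file limit for the user running Elasticsearch and Logstash. A minimal sketch on Linux (the `$$` below is just the current shell; substitute the actual PID of your process):

```shell
# Per-process open-file limit for the current shell session
ulimit -n

# For an already-running process, check the limit it actually started with
# ($$ is the current shell here; replace it with the Elasticsearch PID)
grep 'Max open files' /proc/$$/limits
```

If the limit is low (e.g. the common default of 1024), even normal operation can exhaust it.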

Help me please !!!

(Adrien Grand) #2

You could use lsof to figure out who the culprit is. It could be either logstash or elasticsearch.

(Constantino Matias Salvador) #3

lsof???

I don't understand :(

(Constantino Matias Salvador) #4

OK, I used lsof and I got this:

It's a file with 2000 lines, i.e. there are 2000 open files.

Do you know why there are so many open files?
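Two things may be at play here. First, `path => '/etc/httpd/logs/*'` matches every rotated log file as well, so Logstash may hold a descriptor for each of them; Elasticsearch itself also keeps many segment files open, which is normal. Second, if those descriptors are legitimate, the usual fix is to raise the `nofile` limit for the user running the process. A sketch, assuming the process runs as user `elasticsearch` (the user name, path, and values are assumptions; adjust for your distribution):

```
# /etc/security/limits.conf -- example values
elasticsearch  soft  nofile  65536
elasticsearch  hard  nofile  65536
```

After editing, the service has to be restarted (and, for an interactive user, a new login started) for the new limit to take effect; `ulimit -n` run as that user should then show the new value.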

(system) #5