Too many open files in Elasticsearch

Hello, I have this problem in Elasticsearch and I can't solve it.

[2016-01-14 10:09:34,789][WARN ][netty.channel.socket.nio.AbstractNioSelector] Failed to accept a connection.
java.io.IOException: Too many open files
at sun.nio.ch.ServerSocketChannelImpl.accept0(Native Method)
at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:241)
at org.elasticsearch.common.netty.channel.socket.nio.NioServerBoss.process(NioServerBoss.java:100)
at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
at org.elasticsearch.common.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

I use this input in the Logstash configuration:

input {
  file {
    type => 'apache'
    path => '/etc/httpd/logs/*'
  }
}

I use this because I want to read the Apache access log files, and these files rotate each day.
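By the way, a quick way to count how many files that pattern currently matches (the rotated copies stay in the same directory, so the number grows over time); the path is just the one from the config above:

ls /etc/httpd/logs/* | wc -l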

Help me please !!!

You could use lsof to figure out who the culprit is. It could be either Logstash or Elasticsearch.
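For example (a rough sketch only — you typically need root to see other users' processes, and the PID placeholder is whatever ps reports for Elasticsearch):

# count open files per process name, highest first
lsof | awk '{print $1}' | sort | uniq -c | sort -rn | head

# or count the descriptors held by a single process
lsof -p <elasticsearch_pid> | wc -l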

lsof ???

I don't understand :(

OK, I ran lsof and this is what I got.

It's a file with 2000 lines, i.e., there are 2000 open files.

Do you know why there are so many files open?
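Whether 2000 is really "too many" depends on the limit the Elasticsearch process runs with. A rough way to check it (assuming a default install where Elasticsearch runs as the elasticsearch user — adjust the user name and host if yours differ):

# open-file limit for the user that runs Elasticsearch (often 1024 by default on Linux)
sudo su - elasticsearch -s /bin/bash -c 'ulimit -n'

# Elasticsearch also reports its own open file descriptors via the nodes stats API
curl 'localhost:9200/_nodes/stats/process?pretty'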