Too many open files

Yes, I am using elasticsearch on a single node, and it looks like the number
of file descriptors has exceeded the maximum allowed limit on this machine :(

The "_nodes/process?pretty" output reports that max_file_descriptors is
65535. It seems the JVM can only use 65535 files at once. Can I get it
to use more? If not, do I have to add a new node, or is there
another option?
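(In case it helps others: from what I understand, this limit comes from the OS, not the JVM itself, so it can apparently be raised. On a typical Linux box with PAM limits it looks something like the following sketch; the exact file locations, the user name, and whether a relogin is needed will vary by distro:

    # check the current open-files limit for the user running elasticsearch
    ulimit -n

    # raise it for the current shell, as root, before starting elasticsearch
    ulimit -n 131070

    # or make it permanent in /etc/security/limits.conf, e.g. for a user
    # named "elasticsearch" (name assumed, adjust to your setup):
    #   elasticsearch  soft  nofile  131070
    #   elasticsearch  hard  nofile  131070

Elasticsearch then has to be restarted so the JVM picks up the new limit.)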

Thank you in advance for your answer.

On Thursday, November 7, 2013 4:12:43 PM UTC+1, Jörg Prante wrote:

It looks like you are running a single node? If so, you have too many shards
for a single node. 1040 shards is unusually high. Note that each shard can
consume around 100-200 file descriptors while active in the background, so
1040 shards would need on the order of 100,000-200,000 descriptors, and it is
quite possible you have exceeded 65535.
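You can verify the total shard count with the cluster health API, for example (host and port assumed to be the defaults):

    curl localhost:9200/_cluster/health?pretty
    # the "active_shards" field shows how many shards the cluster is carrying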

Check curl localhost:9200/_nodes/process?pretty or OS utilities for the
number of file descriptors currently in use by the JVM.
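For example, on Linux (the pgrep pattern is an assumption, adjust it to however your JVM is started):

    # what elasticsearch reports about its own descriptor limits
    curl localhost:9200/_nodes/process?pretty

    # count descriptors currently open by the elasticsearch JVM
    ES_PID=$(pgrep -f org.elasticsearch.bootstrap)
    ls /proc/$ES_PID/fd | wc -l
    # or: lsof -p $ES_PID | wc -l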

Jörg
