Caused by: java.lang.OutOfMemoryError: unable to create new native thread
processors: 320
How many cores are on this machine? This is the root cause of the issue and I will be surprised if any version of Java is able to properly handle the number of native threads that you are implicitly trying to create. My guess is that you can simply unset that setting and things should go back to working.
If this box actually has 320 logical cores available to it, then you should use containers or VMs to carve the box up into multiple nodes rather than dedicate 320 cores to a single node. The JVM is simply unable to sustain that. An example of this type of failure can be seen here. Fortunately the JVM has improved since then, but not to 320 cores (the default cap is now the number of available processors, but it used to be 24 and then 32; by setting `processors` explicitly you override that cap).
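To see why `processors: 320` translates into so many native threads, here is a rough sketch (not the actual Elasticsearch implementation) of how a few of the fixed thread pool sizes are derived from the configured processor count. The formulas approximate the documented Elasticsearch 5.x defaults for the bulk, index, and search pools; treat the exact numbers as an assumption to verify against your version's docs.

```java
// Hypothetical estimate of fixed thread pool sizes from the `processors`
// setting, using the documented Elasticsearch 5.x default formulas.
public class ThreadPoolEstimate {

    // bulk and index pools default to one thread per configured processor
    static int bulkSize(int processors)   { return processors; }
    static int indexSize(int processors)  { return processors; }

    // search pool defaults to ((processors * 3) / 2) + 1
    static int searchSize(int processors) { return (processors * 3) / 2 + 1; }

    public static void main(String[] args) {
        int p = 320; // the value from the failing configuration above
        int total = bulkSize(p) + indexSize(p) + searchSize(p);
        System.out.printf("processors=%d bulk=%d index=%d search=%d total=%d%n",
                p, bulkSize(p), indexSize(p), searchSize(p), total);
    }
}
```

With `processors: 320`, these three fixed pools alone ask for over 1,100 native threads, before counting the generic, management, snapshot, and other pools, which is exactly the kind of load that trips "unable to create new native thread".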
I've used this setting for a long time now with previous versions. Maybe you're right in that it causes problems with Java 8u131, perhaps because it should, and the limit is now being enforced in some way.
I guess I misread the Elasticsearch documentation as saying that more threads can boost performance on otherwise underutilized CPU resources; I believe it recommended trying double the value or even more, but I can't find the reference now.
The boxes have 20 cores / 40 hardware threads. I reached that value through trial and error; I think it was related to issues with the index and bulk thread pool queues. With the default values we could hardly reach 5k EPS, and through trial and error I've been able to boost performance up to 25k - 50k EPS.
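If the original goal was to work around bulk rejections rather than to add threads, a more targeted knob may be the bulk queue depth rather than `processors`. A hypothetical `elasticsearch.yml` sketch (setting name per the 5.x docs; verify both the name and the default against your version before using):

```yaml
# elasticsearch.yml - tuning sketch, not a recommendation.
# Leave `processors` at its auto-detected value (40 on a
# 20-core / 40-thread box) and absorb indexing bursts with
# a deeper bulk queue instead of extra threads:
thread_pool.bulk.queue_size: 500   # default is version-dependent; check your release
```

A deeper queue trades memory and latency for fewer rejections, which is usually a safer trade than oversubscribing native threads.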
Maybe it's time to re-test and take better notes. There are a myriad of variables to track when tuning for performance.