Great, is it possible to see where the heap maxes out? For example, by increasing it from 3.5G in 500MB steps until it fails? I'm interested to see what the maximum allowed heap size actually is.
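For example, something like this minimal Java sketch could be run with a progressively larger `-Xmx`/`-Xms` (e.g. `java -Xms4g -Xmx4g HeapCheck`, then 4.5g, and so on) to see at what point the JVM either refuses to start or reports less heap than requested. The class name and the step sizes here are just illustrative, not anything from the Elasticsearch config itself:

```java
// HeapCheck.java
// Prints the maximum heap the JVM actually obtained.
// Run with matching -Xms and -Xmx so the JVM tries to reserve the
// whole heap at startup and fails immediately if it cannot.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap the JVM obtained: %.2f GB%n",
                maxBytes / (1024.0 * 1024 * 1024));
    }
}
```

Setting `-Xms` equal to `-Xmx` mirrors what Elasticsearch recommends, and makes the failure show up at startup rather than later under load.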
OK, this is probably one of two things: either the system is holding on to some memory as cache and won't let the JVM reserve the heap, or the host your VM is running on isn't actually giving you the full 11GB.
Do you manage the hypervisor? Can you look at the VM's configuration?
Given what the system itself takes from memory, I thought 4G should be fine. So the only thing I can say is that either the system is reserving memory beyond what it actually uses, bringing its footprint to around 4GB and leaving you with less than 3.5G for the heap, or the host is not really giving you 11G.
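One way to check the second possibility is to ask the JVM itself how much physical memory the guest reports. This is a sketch that assumes a HotSpot JDK, where the `com.sun.management` extension of the OS MXBean is available; the class name is just illustrative:

```java
// MemCheck.java
// Prints the physical memory the guest OS exposes to the JVM.
import java.lang.management.ManagementFactory;

public class MemCheck {
    public static void main(String[] args) {
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean)
                        ManagementFactory.getOperatingSystemMXBean();
        double gb = 1024.0 * 1024 * 1024;
        System.out.printf("Total physical memory seen by the JVM: %.2f GB%n",
                os.getTotalPhysicalMemorySize() / gb);
        System.out.printf("Free physical memory seen by the JVM:  %.2f GB%n",
                os.getFreePhysicalMemorySize() / gb);
    }
}
```

If the total reported here is noticeably below 11GB, the hypervisor is the place to look; if the total is right but the free figure is low, the guest OS (or its cache) is the more likely culprit.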
So you think that either the VM is not configured correctly (the host doesn't actually have that much RAM), or the Linux system is taking the RAM itself, so I can't give more of it to the Elasticsearch JVM.
That is the only conclusion I can come up with. Unfortunately, when memory is that low it is hard to separate kernel memory and cache from truly free memory.
I really don't think this has anything to do with Elasticsearch. This is the JVM trying to reserve the amount of memory you assign to it, and for some reason it can't get hold of that much.