I'm trying to learn some rules of thumb for when, if ever, it's OK to increase the min and max heap values for ES to more than 50% of physical memory.
This page recommends setting min and max heap to the same value, and keeping both at no more than 50% of the available RAM on the box in order to leave memory for the OS to cache things (I think for Lucene?): https://www.elastic.co/guide/en/elasticsearch/reference/current/heap-size.html
My team recently ran into a problem where a node would sometimes have long GC pauses but be unable to free anything from old gen. That would seem to imply that the node in question was still using that memory for something. (Or possibly the way we are using ES triggered a memory leak.)
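(In case it helps to be concrete, old gen occupancy across collections can be watched with something like the following; the PID lookup is just an example and may need adjusting for your setup.)

```
# Print heap occupancy and GC stats every 5 seconds.
# O = old gen % used, FGC/FGCT = full GC count and total time.
ES_PID=$(pgrep -f org.elasticsearch.bootstrap.Elasticsearch)
jstat -gcutil "$ES_PID" 5000
```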
The total memory use on the node was only about 60%, so it would seem reasonable to try increasing ES's heap by a few GB, but I'm currently a bit scared off by the fact that it goes against the general recommendation. The specific numbers we are currently using are 32 GB of RAM, with min and max heap both set to 16 GB.
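In jvm.options terms, that's roughly:

```
# Current heap settings: min == max, at 50% of the 32 GB on the node.
-Xms16g
-Xmx16g
```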
Does the 50% physical RAM recommendation apply more to machines with smaller amounts of physical memory?
And relatedly, does high ES heap usage combined with otherwise low system memory usage point to anything about why old gen could not be freed in this specific case?
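(To clarify what I'm comparing there: per-node heap usage vs. overall RAM usage, e.g. via the cat nodes API; the localhost address is just a placeholder for one of our nodes.)

```
# Compare JVM heap usage against overall OS memory usage per node.
curl -s 'http://localhost:9200/_cat/nodes?v&h=name,heap.percent,heap.max,ram.percent,ram.max'
```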