ES 2.3.5 here. We use 50GB RAM nodes with the heap size set to 50% of RAM, plus mlockall, on OpenJDK 1.8.0_92.
I'm trying to figure out what occupies the heap.
I'm looking at the node stats of one of the data nodes and see that the used heap size is 18GB, and I'm trying to understand exactly what it comprises. So far I see:
fielddata - 6GB
segments.memory - 1.5GB
segments.terms_memory - 1.3GB
These together are less than 10GB. All of the other memory-related values I could find are under 100MB each, so I wonder where the other ~9GB has gone.
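One way to tally the known consumers is to pull the node stats and sum the memory fields. Below is a minimal sketch; the field paths follow the 2.x `GET /_nodes/stats` layout, and the byte values are hand-made placeholders matching the figures above, not real output. Note that `segments.terms_memory_in_bytes` is reported as a component of `segments.memory_in_bytes`, so summing both would double-count the terms dictionaries.

```python
# Sketch: sum the known heap consumers from a node-stats response and
# report the portion of the used heap they do not account for.
# The dict below is a hand-made stand-in for one node's entry from
# GET /_nodes/stats (values roughly match the figures in this thread).

GB = 1024 ** 3

stats = {
    "jvm": {"mem": {"heap_used_in_bytes": 18 * GB}},
    "indices": {
        "fielddata": {"memory_size_in_bytes": 6 * GB},
        # segments.memory_in_bytes already includes terms_memory_in_bytes,
        # so only the segments total is counted below, not both.
        "segments": {
            "memory_in_bytes": int(1.5 * GB),
            "terms_memory_in_bytes": int(1.3 * GB),
        },
        "query_cache": {"memory_size_in_bytes": 100 * 1024 ** 2},
    },
}

heap_used = stats["jvm"]["mem"]["heap_used_in_bytes"]
accounted = (
    stats["indices"]["fielddata"]["memory_size_in_bytes"]
    + stats["indices"]["segments"]["memory_in_bytes"]
    + stats["indices"]["query_cache"]["memory_size_in_bytes"]
)
unaccounted = heap_used - accounted
print(f"accounted for: {accounted / GB:.1f} GB")   # 7.6 GB
print(f"unaccounted:   {unaccounted / GB:.1f} GB")  # 10.4 GB
```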
You can check under the jvm section how much heap is in use - if it is consistently much lower than what you configured, it makes sense to reduce the heap size. That way the freed memory can also be used by the file system cache for faster searches.
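For reference, the relevant counters live under `jvm.mem` in the node stats; a quick sketch of the check (field names per the 2.x stats format, sample byte values assumed):

```python
# Compare used vs. configured heap from the jvm section of GET /_nodes/stats.
# The dict is a hand-made sample; on a live cluster you would fetch it over HTTP.
jvm_mem = {
    "heap_used_in_bytes": 9663676416,   # ~9 GB used, assumed sample value
    "heap_max_in_bytes": 26843545600,   # ~25 GB configured heap
}

pct = round(100 * jvm_mem["heap_used_in_bytes"] / jvm_mem["heap_max_in_bytes"])
print(f"heap used: {pct}%")  # the stats also report this as heap_used_percent
```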
That's not what I'm trying to find out - my heap is 71% full, and that's not the problem. What I would like to understand is what consumes it.
The jvm section shows that the majority of the memory is in the old generation. Again, I've checked the fielddata size, segments, etc., and it just does not add up to the heap size.
You forgot the query cache (which is also in the stats), per-thread memory, parent/child data, global ordinals, and many small contributing factors (script caches and various other simple caching structures). If you need to break it down, the simplest approach is to take a heap dump and analyze it, since not everything is easy to monitor and returned in the node stats - for example, the data structures built per request to compute aggregations are not shown, because they are short-lived (in more recent releases, however, there is a circuit breaker for this).
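To at least enumerate everything the stats do expose, you can walk the response and collect any field that reports memory in bytes. A rough sketch - the sample dict stands in for the real `GET /_nodes/stats` response, which contains many more sections:

```python
# Walk a node-stats dict and collect every field that looks like a memory
# counter, so all exposed heap consumers show up in one list.

def collect_memory_fields(node, path=""):
    """Recursively gather fields named *memory_in_bytes / *memory_size_in_bytes."""
    found = {}
    for key, value in node.items():
        full = f"{path}.{key}" if path else key
        if isinstance(value, dict):
            found.update(collect_memory_fields(value, full))
        elif key.endswith(("memory_in_bytes", "memory_size_in_bytes")):
            found[full] = value
    return found

sample = {
    "indices": {
        "fielddata": {"memory_size_in_bytes": 6442450944},
        "segments": {"memory_in_bytes": 1610612736,
                     "terms_memory_in_bytes": 1395864371},
        "query_cache": {"memory_size_in_bytes": 104857600},
    }
}

for name, size in sorted(collect_memory_fields(sample).items()):
    print(f"{name}: {size / 1024**3:.2f} GB")
```

For the heap-dump route itself, something like `jmap -dump:live,format=b,file=heap.hprof <pid>` against the Elasticsearch process, opened in an analyzer such as Eclipse MAT, will show what the remaining heap actually holds.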