Heap settings for 128GB (RAM) server

What are the recommended heap settings for a 32-core, 128 GB RAM server? I do understand that the heap shouldn't be more than 30 GB, to make sure it stays below the JVM cutoff for compressed object pointers.

I would recommend going through the link below.


A couple of pointers to remember: heap memory shouldn't be more than 50% of the physical memory available. You can then experiment with your scenario. Too small a heap will cause frequent garbage collection, whereas too large a heap will slow down each garbage collection operation.
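As a sketch of how that 50%-ceiling advice is usually applied in Elasticsearch, the heap is pinned in the `jvm.options` file, with min and max set to the same value (the 30g figure here is illustrative, not a recommendation for your workload):

```
# config/jvm.options (illustrative values)
# Set min and max heap to the same size to avoid resize pauses,
# and keep it below the ~30-32 GB compressed-oops threshold.
-Xms30g
-Xmx30g
```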

It mainly depends on your use case: the amount of data you have, the queries you make, etc. To sum it up, you don't set the JVM heap based on how much physical RAM you have, but on how much you need for your specific use case. And that precise amount can only be found by testing with a representative set of data from your specific context.

Yes. Sure. Thanks.

But my concern is that even though more RAM is available, it looks like it is not recommended to cross 30/32 GB because of the challenges involved with oops/compressed oops.

So my question is whether we can set it (min and max heap) to 64 GB (50% of the allocated 128 GB) and have it work optimally. I will try it out; I just wanted to know whether this is a valid configuration and whether other Elasticsearch installations use this kind of setup as well.
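One way to sanity-check the compressed-oops cutoff yourself (a sketch, assuming a local JDK is on the `PATH`; `-XX:+PrintFlagsFinal` asks the JVM to print the flag values it actually settled on) is:

```shell
# Ask the JVM whether compressed object pointers would be enabled
# for a given max heap size. Repeat with -Xmx64g to compare:
# past roughly 32 GB the JVM silently disables them.
java -Xmx30g -XX:+PrintFlagsFinal -version | grep -i UseCompressedOops
```

If the output shows `UseCompressedOops = true` at 30 GB but `false` at 64 GB, that is the cutoff being discussed here.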

I would clearly not set 64GB of heap. Having more is not necessarily a good thing as more heap means longer GC cycles.

But again, you should really figure out how much heap you need based on your real usage. Maybe you only need 8GB, who knows.

It is also worth noting that Lucene doesn't go through the heap, but maps its files directly into memory, so whatever memory you don't use for your heap, Lucene will still happily use via the operating system's filesystem cache.
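The arithmetic behind that point can be made concrete (illustrative numbers: 128 GB of RAM with a 30 GB heap, as discussed above):

```shell
# Memory left over for the OS filesystem cache, which serves
# Lucene's memory-mapped index files, is roughly total RAM minus heap:
echo $((128 - 30))   # prints 98 (GB, approximately, ignoring other processes)
```

So a smaller heap is not "wasting" the remaining RAM; it hands it to the filesystem cache instead.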


Awesome. Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.