Which is the better RAM allocation strategy?

I have a data node with 128GB of RAM. Is it better to allocate 64GB to ES and leave 64GB to the system?
Or would it be OK (or even better) to allocate, say, 100GB to ES and leave 28GB to the system?
Our workload is write heavy, hence the idea of the second approach instead of a 50-50 split.

I know a heap above ~32GB disables compressed object pointers (oops), so I'd need at least ~48GB to get back to the same amount of usable heap. Since I have so much RAM, would it be better to allocate, say, more than 96GB to ES?
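For what it's worth, the compressed-oops cutoff can be checked directly against the JVM; a quick sketch (the bundled-JDK path assumes a default deb/rpm install, adjust for your layout):

```
# Does the JVM still use compressed oops with a 31GB heap?
/usr/share/elasticsearch/jdk/bin/java -Xmx31g -XX:+PrintFlagsFinal -version | grep UseCompressedOops

# Elasticsearch also reports this at startup in its log, e.g.
#   heap size [31gb], compressed ordinary object pointers [true]
```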

Thanks in advance.

What version are you running?

7.15.2 currently.
But will be upgrading to 8.7.1 very soon.

G1GC is better with larger heaps, but it doesn't remove the need for GC entirely.
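If you do raise the heap, a minimal sketch of how that would look on 7.x/8.x (the file name and install path are assumptions; overrides go in jvm.options.d rather than editing jvm.options directly):

```
# /etc/elasticsearch/jvm.options.d/heap.options
# Pin min and max heap to the same value; whatever is left of the 128GB stays
# with the OS for the filesystem cache. 64g is just the 50-50 example above.
-Xms64g
-Xmx64g
```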

The best way to approach this would be to test it out with your setup.
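One possible way to compare the two layouts, as a sketch (the Rally track and host name are placeholders, not recommendations for your data):

```
# Drive a write-heavy benchmark against an existing cluster with Rally
esrally race --track=http_logs --target-hosts=es-data-1:9200 --pipeline=benchmark-only

# While it runs, compare heap pressure and GC time between the two heap sizes
curl -s 'localhost:9200/_nodes/stats/jvm?filter_path=nodes.*.jvm.mem.heap_used_percent,nodes.*.jvm.gc'
```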

Thanks.
Sure, I will always test it. The reason for this question is to see whether there's any insight from people who know something about this, or who have dealt with a similar situation.
From your answer I gather that the amount of system RAM required simply depends on how much you query "older" (not recently written) documents.
If my write-to-read ratio is really lopsided, like 99% to 1%, then too much system RAM is wasteful.
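One rough way to sanity-check that before cutting the system share: watch how much of the leftover RAM the kernel actually holds as page cache on a data node under real traffic (a quick check, not a full picture of cache hit rates):

```
# The buff/cache column shows how much of the "spare" RAM is being used to
# cache index files; if it stays mostly free, a bigger heap is probably the
# better trade for this workload.
free -h
```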

Thanks.
