Estimate RAM consumption per index

I would like to estimate the RAM requirements for a new node.
I plan to keep about 88 indices on this node, with 2 shards each. Currently one index takes about 63 GB on disk and holds about 240 million documents. I know Elasticsearch relies on heap space to store some data structures and as working memory. I also know I should assign only half of the system RAM to the Elasticsearch heap, and should not exceed 20 shards per GB of heap.
For the above data (~90 indices) I calculated that 24 GB of RAM should be more than enough. But 24 GB seems disturbingly small to me :thinking:.
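
Here is the arithmetic behind that estimate, as a sketch that assumes the 2 shards per index are primaries with no replicas:

```python
# Rough heap/RAM estimate from the shard-count rule of thumb above
# (a sketch; assumes 2 primary shards per index and no replicas).
indices = 88
shards_per_index = 2
shards_per_gb_heap = 20          # rule: at most 20 shards per GB of heap

total_shards = indices * shards_per_index        # 176 shards
min_heap_gb = total_shards / shards_per_gb_heap  # 8.8 GB of heap
min_ram_gb = 2 * min_heap_gb                     # heap should be <= half of RAM

print(f"{total_shards} shards -> >= {min_heap_gb:.1f} GB heap, "
      f">= {min_ram_gb:.1f} GB RAM")
# 176 shards -> >= 8.8 GB heap, >= 17.6 GB RAM
```

By that rule 24 GB clears the bar with room to spare, which is exactly why it feels suspiciously small to me.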

The node will keep log entries from another system in its indices (so time-based data). Search performance is not a priority.

Is there any general rule for estimating the amount of RAM per index?

That depends on a bunch of things: the data structures, the query types, even the Elasticsearch version and your hardware.

You should be fine running with that amount, but without testing you won't know whether it's enough or more than you need.
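
One way to find out is to load a representative slice of the data and watch actual heap usage, e.g. via the `_cat/nodes` API. A minimal sketch, assuming an unsecured cluster on `localhost:9200`:

```python
# Poll actual heap usage on each node via the _cat/nodes API
# (a sketch; assumes an unsecured local cluster on localhost:9200).
import urllib.request

URL = ("http://localhost:9200/_cat/nodes"
       "?v&h=name,heap.percent,heap.current,heap.max,ram.percent")

with urllib.request.urlopen(URL) as resp:
    print(resp.read().decode())
```

If `heap.percent` stays high (say, above ~75%) under a realistic indexing load, the node likely needs more heap or fewer shards; if it stays low, you have room to spare.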
