I would like to estimate the RAM requirements for a new node.
I plan to keep about 88 indices on this node, with 2 shards each. Currently a single index takes about 63 GB on disk and holds about 240 million documents. I know Elasticsearch relies on heap space for some data structures and for working memory. I also know I should assign only half of the system RAM to the Elasticsearch heap and should not exceed 20 shards per GB of heap.
For the numbers above (~90 indices, ~180 shards) I calculated that 24 GB of RAM should be more than enough. But 24 GB seems disturbingly small to me.
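To make my reasoning explicit, here is the back-of-the-envelope calculation I used, as a quick sketch. It applies only the shard-count rule of thumb mentioned above and ignores per-document memory, field data, and filesystem cache:

```python
# Rough heap/RAM estimate from the shard-count rule of thumb.
# Assumptions: 88 indices x 2 shards each, at most 20 shards per GB
# of heap, and heap capped at half of system RAM.

indices = 88
shards_per_index = 2
shards_per_gb_heap = 20  # common Elasticsearch guideline

total_shards = indices * shards_per_index        # 176 shards
min_heap_gb = total_shards / shards_per_gb_heap  # 8.8 GB heap
min_ram_gb = min_heap_gb * 2                     # 17.6 GB RAM (heap = half of RAM)

print(f"{total_shards} shards -> >= {min_heap_gb:.1f} GB heap "
      f"-> >= {min_ram_gb:.1f} GB system RAM")
```

Rounding that 17.6 GB figure up with some headroom is how I arrived at 24 GB.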
The node will store log entries from other systems in these indices (so time-based data), and search performance is not a priority.
Is there any general rule of thumb for estimating how much RAM is needed per index?