Hello,
I am planning to create an Elasticsearch cluster for logging and metrics.
I am using time-based indices, and I want to calculate the optimal number of data nodes and shards this cluster requires.
The logs reach a maximum of 150 GB per day.
I am referring to this blog to specify my cluster requirements: Benchmarking and sizing your Elasticsearch cluster for logs and metrics | Elastic Blog
But I don't know what value to use for the memory:data ratio.
If you have any recommendation for the memory:data ratio in my use case, it would help me calculate the required number of data nodes.
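For context, here is a rough sketch of the calculation I am trying to do, following the formulas from that blog. The retention period, replica count, RAM per node, and the 1:30 hot-tier memory:data ratio are assumptions on my part (the ratio is exactly the value I am unsure about), not recommendations:

```python
import math

# Assumed inputs -- retention, replicas, RAM per node, and the
# memory:data ratio are my guesses, not confirmed values.
raw_gb_per_day = 150      # peak daily log volume in my environment
retention_days = 30       # assumed retention period
replicas = 1              # one replica per primary (assumed)
ram_per_node_gb = 64      # assumed RAM per data node
memory_data_ratio = 30    # assumed hot-tier memory:data ratio of 1:30

# Total data on disk = daily volume * retention * (replicas + 1)
total_data_gb = raw_gb_per_day * retention_days * (replicas + 1)

# Headroom: ~15% for the disk watermark plus ~10% margin of error
total_storage_gb = total_data_gb * (1 + 0.15 + 0.1)

# Data nodes = total storage / (RAM per node * memory:data ratio),
# rounded up, plus one extra node for failover
data_nodes = math.ceil(total_storage_gb / (ram_per_node_gb * memory_data_ratio)) + 1

print(f"Total data:    {total_data_gb} GB")
print(f"Total storage: {total_storage_gb:.0f} GB")
print(f"Data nodes:    {data_nodes}")
```

With these assumed numbers it comes out to 9000 GB of data, about 11250 GB of storage, and 7 data nodes, but the result swings a lot depending on the ratio, which is why I am asking.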
Thanks in advance.