I am trying to estimate my cluster size based on an Excel sheet provided by elastic.co in a webinar on 20/05/2021 about "Elastic sizing". I filled in the daily volume data, but the sheet gives me only 1 data node for "Hot". I expected at least 2 data nodes (1 holding the replica and 1 holding the primary data), even for a low daily volume. How can I overcome this problem, please?
Note: when I change the value of cell B17 to 32, it gives me 2 data nodes, but I am not supposed to change such a parameter. That being said, must my VM always have 64 GB of RAM? If I can't provide a VM with 64 GB of RAM, can I change all the values in row 17?
Yep, the spreadsheet is just missing a minimum node count of 2.
The amount of storage you need will fit on a single node, but the calculation should floor the calculated node count at 2, assuming you have a replica. It's probably just an oversight in the spreadsheet.
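If you want to sanity-check that outside the sheet, here is a minimal sketch of what the corrected calculation might look like. The formula shape and the per-node capacity are assumptions for illustration, not the spreadsheet's actual internals:

```python
import math

def data_nodes_needed(total_storage_gb: float, node_capacity_gb: float,
                      replicas: int = 1) -> int:
    # Storage-driven count: primaries plus replica copies spread over nodes.
    by_storage = math.ceil(total_storage_gb * (1 + replicas) / node_capacity_gb)
    # HA floor: each shard copy needs its own node, so at least replicas + 1.
    return max(by_storage, replicas + 1)

# Low daily volume: the data fits on one node, but the floor keeps it at 2.
print(data_nodes_needed(total_storage_gb=200, node_capacity_gb=1_500))  # 2
```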
So you could probably have:
2 hot nodes
2 warm nodes

For the hot tier, you can set the node RAM to 32 GB and it would probably show you the two hot nodes; the warm nodes could perhaps go as low as 16 GB.
Long story short: if you want HA with a replica, you need a minimum of two nodes in both the hot and warm tiers.
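That floor is independent of storage volume, since Elasticsearch never allocates a replica shard on the same node as its primary. A quick way to state it (the helper name is mine, just for illustration):

```python
# Each shard copy (primary + replicas) must sit on a different node.
def min_nodes_for_ha(replicas: int) -> int:
    return replicas + 1

print(min_nodes_for_ha(replicas=1))  # 2 -> applies to both hot and warm tiers
```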