I have provisioned 300 GB of hard disk storage for the Elasticsearch stack. Is there a way to measure how much storage is consumed by the ES service? I need this for capacity planning, i.e. how much memory and storage should be provided to the ES service. I am pushing all application logs to the Elastic Stack. How much log data can be stored in the Elasticsearch service with 300 GB of hard disk storage?
1 GB of raw log data (or pure JSON ready for ES insertion) is not 1 GB on disk. This is due to the Lucene engine and ES indexing overhead. You can also enable compression, which saves space (on the order of 10%) but reduces performance. Roughly, plan on about 240 GB of the 300 GB being usable for index data.
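If you want to try the compression option, it is the `index.codec` setting, which must be set at index creation (or on a closed index). A minimal sketch, assuming a local cluster at `http://localhost:9200` with no authentication and a hypothetical index name `app-logs-000001`:

```python
# Sketch: create an index with best_compression enabled.
# Assumptions: local cluster on http://localhost:9200, no auth,
# index name "app-logs-000001" is just an example.
import requests

settings = {
    "settings": {
        "index.codec": "best_compression",  # smaller segments, at some CPU/search cost
        "number_of_shards": 1
    }
}

resp = requests.put("http://localhost:9200/app-logs-000001", json=settings)
print(resp.status_code, resp.json())
```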
For fault tolerance you need at least 1 replica shard, which halves the usable capacity: 50% => roughly 120-125 GB of actual log data.
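As for measuring what is actually consumed right now, the `_cat` APIs report store size per node and per index, so you can track real growth instead of estimating. A minimal sketch, again assuming a local unauthenticated cluster (adjust host and credentials for your setup):

```python
# Sketch: report disk usage per data node and store size per index.
# Assumption: cluster reachable at http://localhost:9200 without auth.
import requests

BASE = "http://localhost:9200"

# Disk used/available per data node (store size here includes replicas).
alloc = requests.get(
    f"{BASE}/_cat/allocation?v&h=node,disk.indices,disk.used,disk.avail,disk.percent"
)
print(alloc.text)

# Store size per index: store.size counts primaries + replicas,
# pri.store.size counts primaries only.
indices = requests.get(
    f"{BASE}/_cat/indices?v&h=index,pri,rep,docs.count,pri.store.size,store.size&s=store.size:desc"
)
print(indices.text)
```

Comparing `pri.store.size` for an index against the amount of raw log data you fed into it gives you the actual on-disk expansion ratio for your own logs, which is a better basis for capacity planning than any generic rule of thumb.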