Hello everyone,
A question regarding Elastic architecture: I have 100 GB per day of ingestion.
If I want to keep the data for 300 days on a frozen node without any replication, how should I estimate the disk storage?
I don't have a license, so I can't use searchable snapshots; is the estimate then simply 300 days x 100 GB?
If you are not using searchable snapshots, there is nothing magic that reduces the size of the data on disk as it moves from tier to tier, apart from possibly the number of replicas. Your calculation therefore seems reasonable.
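For what it's worth, here is a quick back-of-the-envelope sketch of that estimate. The 1:1 ratio of raw ingest to on-disk index size and the ~15% free-space headroom (to stay clear of disk watermarks) are my own assumptions, not something from this thread; adjust both to your own measurements.

```python
# Rough disk estimate for a frozen node without searchable snapshots.
# Assumptions (not from the thread): raw ingest maps roughly 1:1 to
# on-disk index size, and we leave ~15% headroom for disk watermarks.

daily_ingest_gb = 100    # GB ingested per day
retention_days = 300     # days of data kept on the frozen node
replicas = 0             # no replication, per the question

primary_data_gb = daily_ingest_gb * retention_days   # 30,000 GB
total_data_gb = primary_data_gb * (1 + replicas)      # still 30,000 GB with 0 replicas
headroom_factor = 1.15                                # assumed free-space margin
provisioned_gb = total_data_gb * headroom_factor

print(f"Index data:    {total_data_gb:,.0f} GB (~{total_data_gb / 1000:.0f} TB)")
print(f"With headroom: {provisioned_gb:,.0f} GB (~{provisioned_gb / 1000:.1f} TB)")
```

With these assumptions that works out to roughly 30 TB of index data, or about 34.5 TB of provisioned disk once headroom is included.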