Are there sizing/capacity calculators for ELK? In raw terms, we are looking at data ingestion of about 280 GB per day. Assuming most of that arrives over an 8-hour window, that works out to roughly 280/8 = 35 GB per hour, maybe 50 GB per hour at peak.
We also need to retain the data for 12 months, so roughly 102 TB in total. Does that seem feasible for ELK, given that we would need to scale it out somehow? Is that high for log storage, or reasonable? Should it be partitioned out?
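For what it's worth, here is the back-of-the-envelope math behind the 102 TB figure; the replica count and index overhead numbers below are my own assumptions, not measured values:

```python
# Rough capacity math behind the 102 TB estimate.
daily_ingest_gb = 280        # raw ingest per day
retention_days = 365         # 12-month retention

raw_total_tb = daily_ingest_gb * retention_days / 1000
print(f"Raw retained data: ~{raw_total_tb:.0f} TB")   # ~102 TB

# With one replica per shard and ~10% indexing overhead
# (both assumptions on my side), the on-disk footprint
# roughly doubles plus a bit more:
replicas = 1
index_overhead = 1.10
on_disk_tb = raw_total_tb * (1 + replicas) * index_overhead
print(f"Estimated on-disk size: ~{on_disk_tb:.0f} TB")  # ~225 TB
```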
I would appreciate suggestions from an ELK expert and am hoping to get some guidance.