Sizing/Capacity for ELK in Production

Are there sizing/capacity calculators for ELK? In raw terms, we are looking at data ingestion of about 280 GB per day. Since most of that arrives during an 8-hour business day, the hourly rate is probably around 280/8 = 35 GB per hour, maybe 50 GB at our peak?

And we need to retain it for 12 months, so roughly 280 GB × 365 ≈ 102 TB of raw data. Does that seem feasible for ELK, given that we would need to scale it out somehow? Is that high for log storage, or reasonable? Should it be partitioned out? (A rough back-of-envelope sketch of these numbers is below.)
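To sanity-check those numbers, here is a minimal back-of-envelope sketch in Python. The replica count and the on-disk expansion factor are illustrative assumptions, not measured values; the actual Elasticsearch footprint can be larger or smaller than the raw log size depending on mappings, doc_values, and the compression codec, so it needs to be tested with real data.

```python
# Back-of-envelope ELK storage estimate.
# All factors below are assumptions for illustration, not measured values.

raw_per_day_gb = 280          # raw log volume per day
business_hours = 8            # most logs arrive during an 8-hour business day
retention_days = 365          # 12 months of retention

replicas = 1                  # 1 replica = 2 copies of every shard (assumption)
index_overhead = 1.2          # on-disk expansion vs. raw logs (assumption)

avg_hourly_gb = raw_per_day_gb / business_hours
raw_total_tb = raw_per_day_gb * retention_days / 1000

# Storage the cluster actually needs, including replica copies and overhead
cluster_total_tb = raw_total_tb * (1 + replicas) * index_overhead

print(f"Average hourly ingest (8h day): {avg_hourly_gb:.0f} GB/h")   # ~35 GB/h
print(f"Raw data over 12 months:        {raw_total_tb:.0f} TB")      # ~102 TB
print(f"Estimated cluster storage:      {cluster_total_tb:.0f} TB")  # ~245 TB
```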

I would appreciate suggestions from an ELK expert and am hoping to get some guidance.

May I suggest you look at the following resource about sizing:

https://www.elastic.co/elasticon/conf/2016/sf/quantitative-cluster-sizing
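As a rough illustration of what that kind of quantitative sizing exercise produces, here is a hedged sketch that turns the numbers above into a data-node estimate. The per-node usable disk, the copy count, the indexing overhead, and the headroom factor are placeholder assumptions; the talk linked above covers how to measure the real expansion ratios with your own data before committing to hardware.

```python
import math

# Placeholder assumptions -- replace with values measured on your own data.
raw_total_tb = 102            # 280 GB/day * 365 days of retention
copies = 2                    # primary + 1 replica (assumption)
index_overhead = 1.2          # on-disk expansion vs. raw logs (assumption)
disk_per_node_tb = 8          # usable disk per data node (assumption)
headroom = 0.75               # keep ~25% free for merges and disk watermarks (assumption)

cluster_storage_tb = raw_total_tb * copies * index_overhead
data_nodes = math.ceil(cluster_storage_tb / (disk_per_node_tb * headroom))

print(f"Cluster storage needed: ~{cluster_storage_tb:.0f} TB")
print(f"Data nodes at {disk_per_node_tb} TB usable each: {data_nodes}")
```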
