Elasticsearch Sizing in Petabytes

Hi all,
I need your help/suggestions on sizing an ELK deployment in the petabyte range.

My daily log ingestion volume is 10 TB/day, and we would like the following data retention (rough storage math after the list):
3 months hot data [hot nodes]
1 year warm data [warm nodes]
4 years cold data [cold nodes]
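
For reference, here is my back-of-the-envelope math for on-disk storage per tier. The retention reading (cumulative: hot for the first 90 days, then warm to day 365, then cold for a further 4 years), the replica counts, and the 1.1 expansion factor are all my own assumptions; actual on-disk size depends heavily on mappings and compression:

```python
# Rough storage estimate per tier at 10 TB/day ingest.
# Assumptions (mine, not measurements):
#   - cumulative retention: 90 days hot, warm to day 365, cold for 4 more years
#   - 1 replica on hot/warm, 0 replicas on cold
#   - on-disk size ~= raw size * 1.1 (expansion factor; varies with mappings)
DAILY_TB = 10
OVERHEAD = 1.1

tiers = [
    # (name, days resident in tier, replica copies)
    ("hot", 90, 1),
    ("warm", 365 - 90, 1),
    ("cold", 4 * 365, 0),
]

total = 0.0
for name, days, replicas in tiers:
    on_disk = DAILY_TB * days * (1 + replicas) * OVERHEAD
    total += on_disk
    print(f"{name:>4}: ~{on_disk:,.0f} TB on disk")

print(f"total: ~{total:,.0f} TB (~{total / 1024:.1f} PB)")
```

So even with no replicas on the cold tier, this lands well into the tens of petabytes, which is why I'm asking about multiple clusters.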

Can you please help with sizing? The logs are IT infrastructure and application logs.
How many clusters do I need? Can a single cluster hold this much data?
Most importantly: how much storage can be allocated to a single node, given that each hot node has 128 GB RAM, 19 TB of SSD disk, and 12 CPU cores?

May I suggest you look at the following resource about sizing:

https://www.elastic.co/elasticon/conf/2016/sf/quantitative-cluster-sizing
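
As a very rough first cut on your per-node question, one common rule of thumb is to size disk as a multiple of RAM, with much lower multiples on hot nodes (which do the indexing work) than on warm or cold ones. The ratios in the sketch below (hot ~1:30, warm ~1:100, cold ~1:300) are assumptions on my part, not official guidance; the talk above shows how to derive your own numbers by benchmarking:

```python
import math

# First-cut node counts per tier from an assumed RAM:disk ratio.
# The ratios are rule-of-thumb assumptions (hot ~1:30, warm ~1:100,
# cold ~1:300); benchmark with real data before committing to them.
RAM_GB = 128  # per node, matching the hot-node spec in the question

tier_specs = {
    # tier: (on-disk TB from the earlier estimate, assumed RAM:disk ratio)
    "hot": (1_980, 30),
    "warm": (6_050, 100),
    "cold": (16_060, 300),
}

for tier, (disk_tb, ratio) in tier_specs.items():
    tb_per_node = RAM_GB * ratio / 1024  # usable disk per node, in TB
    nodes = math.ceil(disk_tb / tb_per_node)
    print(f"{tier:>4}: ~{tb_per_node:.1f} TB/node -> ~{nodes} nodes")
```

Note what this implies for the 19 TB SSD hot nodes: at a 1:30 ratio, 128 GB of RAM only comfortably addresses roughly 3.75 TB of data, so most of that disk would sit idle on the hot tier (it is a better fit for warm or cold). Again, treat these ratios as starting assumptions to validate, not answers.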
