Hi All,
Need your help/suggestions with sizing an ELK deployment in the petabyte range.
My daily log ingestion volume is 10 TB/day, and we would like the following data retention:
3 months hot data (hot nodes)
1 year warm data (warm nodes)
4 years cold data retention
Can you please help with sizing? The logs would be IT infrastructure and application logs.
How many clusters do I need? Can a single cluster hold this much data?
Most importantly, how much storage can be allocated to a single node, if each hot node has 128 GB RAM, 19 TB of SSD disk, and 12 CPU cores?
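For reference, here is the rough back-of-envelope math I have so far (a minimal sketch; the daily volume and retention windows are from my requirements above, but the replica count and overhead factor are assumptions on my part, not measured numbers):

```python
# Rough back-of-envelope sizing sketch. Daily volume and retention windows
# come from the requirements above; replica count and overhead are assumptions.

DAILY_INGEST_TB = 10          # raw logs ingested per day
REPLICAS = 1                  # assumed: 1 replica shard per primary
OVERHEAD = 1.1                # assumed: ~10% index/metadata overhead

tiers = {
    "hot (3 months)": 90,         # days retained on hot nodes
    "warm (1 year)": 365,         # days retained on warm nodes
    "cold (4 years)": 4 * 365,    # days retained on cold storage
}

for tier, days in tiers.items():
    raw_tb = DAILY_INGEST_TB * days
    with_replica_tb = raw_tb * (1 + REPLICAS) * OVERHEAD
    print(f"{tier}: ~{raw_tb:,.0f} TB raw, ~{with_replica_tb:,.0f} TB with replicas/overhead")
```

By this math the cold tier alone is well over 10 PB raw before any compression, which is why I am asking whether a single cluster is realistic.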