Here we plan to create an ELK cluster. Our current scenario is:
Daily data size would be around 500 GB;
Monthly, around 15 TB.
Data are generated and pushed to the index.
We need to know the number of cluster nodes and the hardware requirements for the above scenario.
Please help me plan based on these requirement details.
I suspect you have a typo there, as 500 GB of raw data per day would result in about 15 TB of raw data per month, not 1.5 TB. One important aspect when sizing a cluster is how long you are going to keep your data in the cluster, as this drives the total amount of data that needs to be stored. Please also have a look at this blog post on sizing logging and metrics clusters. It links to most of the additional resources I recommend on the topic.
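As a rough illustration of how retention drives total storage, here is a back-of-the-envelope sketch in Python. The replica count, index-to-raw expansion factor, disk headroom, and per-node disk size are all assumptions for illustration, not recommendations; measure the index-to-raw ratio on your own data.

```python
import math

# Figures from the question above
daily_raw_gb = 500            # raw data ingested per day
retention_days = 30           # keep roughly one month of data

# Assumed factors -- adjust these to your own measurements
replicas = 1                  # one replica copy per primary (assumed)
index_expansion = 1.1         # indexed size vs raw size (assumed; varies widely)
disk_headroom = 0.85          # keep ~15% of disk free (assumed)
disk_per_node_gb = 2000       # usable disk per data node (assumed hardware)

# Total indexed data on disk, including replica copies
total_index_gb = daily_raw_gb * retention_days * index_expansion * (1 + replicas)

# Disk to provision, leaving headroom below the disk watermarks
required_disk_gb = total_index_gb / disk_headroom

# Minimum data-node count to hold that much data
data_nodes = math.ceil(required_disk_gb / disk_per_node_gb)

print(f"Indexed data incl. replicas: {total_index_gb:.0f} GB")
print(f"Disk to provision:           {required_disk_gb:.0f} GB")
print(f"Data nodes needed (storage): {data_nodes}")
```

Note this only sizes for storage; indexing and query load may require more nodes or more CPU/RAM per node, which is why the benchmarking advice in the linked post matters.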
Thanks for the response, Christian! Yes, we keep data for around one month; we need the data for machine learning predictions and future use. Can you please give me some advice? I don't know how to decide on the number of master nodes and data nodes.
Please help me.