Hello there, I have a question regarding the cost my company would pay for an Elasticsearch cluster.
The company will receive about 2 billion documents per year, and in the future it is going to be tough to handle them with a single-node Elasticsearch deployment. How many nodes do I need to handle this amount of data? I mean, I want a real measurement. (Or would it be better to integrate a big data architecture such as Hadoop with Spark for this volume?)
May I suggest you look at the following resource about sizing:
https://www.elastic.co/elasticon/conf/2016/sf/quantitative-cluster-sizing
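For a rough back-of-envelope figure while you watch that talk, the arithmetic comes down to: average document size × document count × (1 + replicas) × indexing overhead, divided by the usable disk per data node. Here is a minimal Python sketch; the average document size, overhead factor, and per-node disk numbers are all assumptions for illustration, and you should replace them with measurements from a test index of your own data:

```python
import math

# All inputs below are assumptions for illustration; measure your own
# average document size and per-node disk before trusting the result.
DOCS_PER_YEAR = 2_000_000_000    # from the question: ~2 billion documents/year
AVG_DOC_SIZE_BYTES = 1_000       # assumed: ~1 KB per indexed document
REPLICAS = 1                     # one replica copy for resilience
INDEX_OVERHEAD = 1.1             # assumed: ~10% on-disk overhead vs. raw size
DISK_PER_NODE_GB = 2_000         # assumed: ~2 TB usable disk per data node
DISK_WATERMARK = 0.75            # keep data nodes below ~75% disk usage

raw_gb = DOCS_PER_YEAR * AVG_DOC_SIZE_BYTES / 1024**3
total_gb = raw_gb * INDEX_OVERHEAD * (1 + REPLICAS)
effective_gb_per_node = DISK_PER_NODE_GB * DISK_WATERMARK
nodes = math.ceil(total_gb / effective_gb_per_node)

print(f"Raw data per year: {raw_gb:,.0f} GB")
print(f"Total on disk:     {total_gb:,.0f} GB (replicas + overhead)")
print(f"Data nodes needed: {nodes} (at {effective_gb_per_node:,.0f} GB usable each)")
```

Note this only estimates storage; the talk's methodology also has you benchmark indexing and query throughput on a single node with representative data, since CPU, heap, and query load often dictate node count before disk does.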
Thank you.