Since we are preparing to start a new project based on ELK, I would be glad if some of you could share details of the largest ELK cluster you've seen or worked with:

- How many nodes? How many of them are data nodes?
- What type of hardware was required for each node type (CPU, RAM, disks)?
- How many GBs are ingested per day?
- How frequent and how extensive are searches on this cluster?
- How does the cluster perform?
Hey @Yuval_Khalifa, have you seen the videos and slides from Elasticon? Lots of interesting information, stats, and metrics about cluster sizes, numbers of documents, amount ingested per day, etc. For example, FireEye has:
- 3.6 petabytes of raw storage across 40 clusters
- 700 billion indexed events in 400+ nodes
- 300,000 events per second
The talk goes into detail about what infrastructure they are running this on.
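If you want to collect the same kind of sizing metrics from your own cluster for comparison, Elasticsearch exposes them via the `GET _cluster/stats` API. Below is a minimal sketch that summarizes the relevant fields from such a response; the JSON payload here is a made-up sample (the field names match the real API, the values are illustrative only):

```python
import json

# Illustrative sample of a `GET _cluster/stats` response, trimmed to the
# fields relevant to cluster sizing. The values are invented for the example.
sample = json.loads("""
{
  "nodes": {"count": {"total": 12, "data": 8}},
  "indices": {
    "docs": {"count": 700000000},
    "store": {"size_in_bytes": 5497558138880}
  }
}
""")

def summarize(stats):
    """Extract the sizing metrics discussed above from a cluster-stats payload."""
    return {
        "total_nodes": stats["nodes"]["count"]["total"],
        "data_nodes": stats["nodes"]["count"]["data"],
        "indexed_docs": stats["indices"]["docs"]["count"],
        "store_tb": round(stats["indices"]["store"]["size_in_bytes"] / 1024**4, 2),
    }

print(summarize(sample))
# e.g. {'total_nodes': 12, 'data_nodes': 8, 'indexed_docs': 700000000, 'store_tb': 5.0}
```

In practice you would fetch the JSON from your cluster (e.g. `curl -s localhost:9200/_cluster/stats`) and feed it to the same function; ingest rate per day isn't in this API and is usually tracked via monitoring instead.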