Need help sizing ELK for 3 million (30 lakh) log lines per minute

Hello,

We have servers in our system that will produce around 3 million (30 lakh) log lines every minute. These logs need to be ingested into Elasticsearch through Logstash.
We have some regexes written in our Logstash filters, and logs are shipped to Logstash by Filebeat.
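For context, here is a minimal sketch of what such a pipeline might look like. The port, hosts, index name, and grok pattern are placeholders rather than our actual config:

```
input {
  beats {
    port => 5044                          # Filebeat ships to this port
  }
}

filter {
  grok {
    # Placeholder pattern; our real config uses several custom regexes
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://es-node1:9200"]     # placeholder host
    index => "logs-%{+YYYY.MM.dd}"        # daily indices to manage retention
  }
}
```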
Could someone please suggest how many nodes, shards, and replicas would be required for such a workload?

Thanks in advance.

-Sumeet

This will depend a lot on the use case, the size and complexity of your documents, the retention period, the mappings you use, and the type of hardware you are going to deploy on, so there is no way to estimate sizing from the information you have provided. The best way to find out is to benchmark. You may find the following resources useful (a sketch of a benchmarking run follows the links):

https://www.elastic.co/webinars/using-rally-to-get-your-elasticsearch-cluster-size-right

https://www.elastic.co/elasticon/conf/2016/sf/quantitative-cluster-sizing
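As a minimal sketch, a benchmark with Rally against an existing cluster might look like the following. The track choice and target host are assumptions; a custom track built from samples of your real log lines will give far better estimates than a stock track:

```
# Install Rally (requires Python); see the Rally docs for prerequisites
pip install esrally

# Run against an existing cluster without letting Rally provision
# Elasticsearch itself (benchmark-only pipeline). "http_logs" is a
# stock track with log-like documents.
esrally race --track=http_logs \
  --pipeline=benchmark-only \
  --target-hosts=localhost:9200
```

Repeating the race while varying node count, shard count, and mappings lets you measure per-node indexing throughput for your data instead of guessing, and from that you can work out how many nodes you need to sustain 3 million lines per minute with headroom.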
