ELK Baseline Capacity & Sizing for Production

Hi
Stack: {Filebeat, Logstash JDBC input, Logstash file input} --> Logstash --> Elasticsearch --> Kibana

Now that I am done with the POC on the development server, I need to plan the production deployment.

Assumptions:
5 GB of data per hour will be shipped from multiple sources
Server available for ELK: 8-core CPU, 16 GB RAM
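
Before benchmarking individual components, it helps to translate those assumptions into sustained rates and disk usage. Here is a minimal back-of-envelope sketch in Python; the 500-byte average event size, replica count, and 20% indexing overhead are illustrative assumptions, not measured values:

```python
# Back-of-envelope math for the stated assumptions (5 GB/hour ingest).
GB = 1024 ** 3
ingest_gb_per_hour = 5
avg_event_bytes = 500          # assumption: measure your real log lines

bytes_per_sec = ingest_gb_per_hour * GB / 3600
events_per_sec = bytes_per_sec / avg_event_bytes

# Rough on-disk footprint: raw volume x (1 + replicas) x overhead factor.
raw_gb_per_day = ingest_gb_per_hour * 24
replicas = 1                   # assumption: one replica per shard
overhead = 1.2                 # assumption: ~20% JSON/doc-values overhead
disk_gb_per_day = raw_gb_per_day * (1 + replicas) * overhead

print(f"Sustained ingest: {bytes_per_sec / 2**20:.1f} MiB/s")
print(f"Event rate:       ~{events_per_sec:,.0f} events/s")
print(f"Disk per day:     ~{disk_gb_per_day:.0f} GB (raw {raw_gb_per_day} GB)")
```

At roughly 1.4 MiB/s and ~3,000 events/s, the ingest rate itself is a modest load for one 8-core/16 GB box; at ~288 GB/day on disk (with one replica), storage rather than CPU is likely the first constraint.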

I need to do the hardware sizing and benchmark the capacity of the stack, and in fact of each individual module, i.e.:
How many lines (or how many MB/GB of data) Filebeat can ship to Logstash per second/minute/hour
How many lines (or how much data) Logstash can receive and push to Elasticsearch (without data loss)
How much data Elasticsearch can receive and index (see the bulk-indexing sketch after this list)
Whether it is better to store the data on a local HDD or a SAN
The recommended Elasticsearch configuration, such as nodes, shards/replicas, etc. (see the index-settings sketch after this list)
... and so on
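
For the Elasticsearch side specifically, raw bulk-indexing throughput is straightforward to measure, and it sets the ceiling for the whole pipeline. A minimal sketch, assuming a recent Elasticsearch reachable at localhost:9200 and a throwaway index named sizing-test (both assumptions; run it against production-like hardware and your real mapping to get meaningful numbers):

```python
# Minimal sketch: time raw bulk indexing into a throwaway index.
# ES_URL and the ~500-byte synthetic document are assumptions to adapt.
import json
import time
import urllib.request

ES_URL = "http://localhost:9200/_bulk"   # assumption: local test node
BATCH = 5000                             # documents per bulk request
ROUNDS = 20

action = json.dumps({"index": {"_index": "sizing-test"}})
doc = json.dumps({"message": "x" * 480, "level": "INFO"})
# Bulk body: alternating action/document lines, newline-terminated.
body = ("\n".join(f"{action}\n{doc}" for _ in range(BATCH)) + "\n").encode()

start = time.time()
for _ in range(ROUNDS):
    req = urllib.request.Request(
        ES_URL, data=body,
        headers={"Content-Type": "application/x-ndjson"},
    )
    urllib.request.urlopen(req).read()
elapsed = time.time() - start

docs = BATCH * ROUNDS
mib = len(body) * ROUNDS / 2**20
print(f"{docs} docs in {elapsed:.1f}s -> "
      f"{docs / elapsed:,.0f} docs/s, {mib / elapsed:.1f} MiB/s")
```

Filebeat and Logstash can then be measured the same way: write a file at a known rate, let Filebeat tail it through Logstash, and compare the document count in Elasticsearch against what was written, which also surfaces any data loss under back-pressure.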
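
For the nodes/shards/replicas question, note that the primary shard count of an index is fixed at creation time, so it is worth setting explicitly up front. A minimal sketch that creates a daily index with explicit settings; the index name, shard count, and replica count are assumptions to adapt, not recommendations for your cluster:

```python
# Minimal sketch: create a daily index with explicit shard/replica settings.
import json
import urllib.request

ES_HOST = "http://localhost:9200"        # assumption: adjust to your cluster

settings = {
    "settings": {
        # A common rule of thumb is to keep shards roughly 10-50 GB once
        # filled; for ~120 GB/day raw, 3-5 primaries is a plausible start.
        "number_of_shards": 3,
        "number_of_replicas": 1,         # one replica doubles disk usage
    }
}

req = urllib.request.Request(
    f"{ES_HOST}/logs-2016.01.01",        # hypothetical daily index name
    data=json.dumps(settings).encode(),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
print(urllib.request.urlopen(req).read().decode())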

Based on this, can anyone guide me or give pointers on how to calculate the sizing or baseline capacity for the ELK Stack?

Thanks

I am looking at the same type of requirement. Can anyone suggest ideas for sizing the production boxes?