Logstash Sizing

What is the recommended typical sizing configuration with respect to memory, disk, and CPU for Logstash?

That depends on how long your rope is.

Jokes aside:

  • The CPU needs depend on how many events per second you're going to process.
  • 1-2 GB RAM should be plenty.
  • Logstash persists very little information to disk by itself, so you basically only need space for the program files and the log files it produces.

Thanks. Around 100 million documents per day. We are considering three Logstash nodes reading from Kafka.

100M events/day is only about 1,200 events/second. Your events will probably not be evenly distributed over the day, so in reality I guess you'll have to deal with higher loads. On the other hand, with Kafka as a buffer you'll be able to cope with spikes just fine, assuming you can live with latency in downstream event delivery.
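The arithmetic behind that estimate, as a quick sketch (the peak-to-average multipliers are illustrative planning assumptions, not figures from this thread):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

events_per_day = 100_000_000
average_eps = events_per_day / SECONDS_PER_DAY
print(f"Average rate: {average_eps:.0f} events/second")  # ~1157

# Traffic is rarely flat over a day; sizing for some multiple of the
# average is a common rule of thumb. The factors below are guesses.
for peak_factor in (3, 5):
    peak_eps = average_eps * peak_factor
    print(f"Peak at {peak_factor}x average: {peak_eps:.0f} events/second")
```

With Kafka in front, a burst above what Logstash can drain simply accumulates as consumer lag and is worked off later, which is why spikes mostly cost latency rather than lost events.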

A single-core machine can process hundreds of events per second so three machines can probably handle your load just fine. But don't take my word for it; measure yourself with the filters that you're going to use.


Sure. Thank you. This is very helpful.

The filters you have set up can heavily change your requirements.
We have some Logstash configurations that are hundreds of lines long, with lots of mutate, grok, and other expensive operations. Those 4-core machines can only do about 2,500 events per second. But we have other 4-core machines with almost no filters, and they can do at least 4 or 5 times that amount.
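For illustration, a pipeline with the kind of expensive filters described above might look like this (topic names, hosts, and field names are placeholders, not taken from the thread):

```conf
input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics            => ["app-logs"]
  }
}

filter {
  # grok is regex-based and is typically the most expensive step
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # individual mutate operations are cheap, but they add up in long configs
  mutate {
    remove_field => ["message"]
    lowercase    => ["request"]
  }
}

output {
  elasticsearch {
    hosts => ["es:9200"]
  }
}
```

Benchmarking your actual filter chain against a sample of real events, as suggested earlier in the thread, is the only reliable way to size the CPU.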


Thanks Brandon for the additional information.
