We have built a logging system using the Elastic Stack which currently ingests log data from different applications for monitoring, and we plan to add more applications in the future. A single Logstash instance runs in a Docker container on a dedicated virtual machine; after transformation, Logstash sends the logs to Elasticsearch hosted on a different server.

My questions:

1. What is the maximum load a single Logstash instance can handle (once we scale up), and what happens when that threshold is reached? Is the maximum threshold configurable?
2. Is it possible to deploy multiple Dockerized Logstash instances and route incoming log traffic to specific instances based on their current load?
3. What is the best approach once the volume of log data coming from multiple sources grows exponentially as we integrate more and more applications into our logging system?
There is no single answer: throughput depends on what your events look like, what your processing config does, what system resources you have, which version you are on, and which JVM.
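For reference, the main per-instance tuning knobs live in `logstash.yml`. By default Logstash applies backpressure to its inputs when outputs cannot keep up; enabling the persistent queue buffers events to disk instead. The values below are illustrative starting points, not recommendations:

```yaml
# logstash.yml — illustrative settings, tune against your own workload
pipeline.workers: 4        # defaults to the number of CPU cores
pipeline.batch.size: 250   # events per worker per batch (default 125)
pipeline.batch.delay: 50   # ms to wait before dispatching an undersized batch

# Buffer to disk under backpressure instead of blocking inputs immediately
queue.type: persisted
queue.max_bytes: 4gb       # once full, backpressure propagates to inputs
```

So the "threshold" is not a single configurable number; it is whatever throughput your filters, hardware, and these settings sustain, with backpressure (or the queue) absorbing anything beyond it.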
If you really want to know, you should do some testing with your own data.
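One simple way to test is the `generator` input, which synthesizes events as fast as the pipeline can consume them, so you can measure the cost of your filters in isolation. A minimal sketch (the message text and count are placeholders — substitute a representative sample of your own log lines and your real filter block):

```conf
# benchmark.conf — hypothetical throughput test, not a production pipeline
input {
  generator {
    count   => 1000000
    message => "sample log line resembling your real traffic"
  }
}
filter {
  # paste your real filter config here to measure its cost
}
output {
  stdout { codec => dots }   # one dot per event; cheap enough not to skew results
}
```

Run it with `bin/logstash -f benchmark.conf`, time how long it takes to drain, and divide the event count by the elapsed seconds to get an events-per-second figure for your config on your hardware.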
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.