Relating events per second to size in MB or GB

If a Logstash node is processing 5000 events per second, how can I relate that to size in terms of MB or GB?

The size depends on the size of the logs it is processing.

It appears that the schema size is around 828 bytes per line in a log file, so you could use that as a baseline. But it depends on the size of the events you are processing.
If you are processing stack traces, they will be quite a bit larger than, say, dmesg lines. So it entirely depends on your data.
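As a rough illustration, here is a minimal sketch of the conversion, assuming the ~828-byte baseline above (substitute the average event size you actually observe):

```python
# Convert an event rate to an estimated throughput.
# 828 bytes/event is only the baseline mentioned above; measure your own logs.
events_per_second = 5000
avg_event_bytes = 828

bytes_per_second = events_per_second * avg_event_bytes
mb_per_second = bytes_per_second / 1024**2

print(f"{mb_per_second:.2f} MB/s")                   # ~3.95 MB/s
print(f"{mb_per_second * 3600 / 1024:.1f} GB/hour")  # ~13.9 GB/hour
```

So at that baseline, 5000 events per second is on the order of 4 MB/s, not gigabytes per second.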

We have around 120 VMs running various applications and databases; a few of these are also web servers. Around 2000 users are accessing these applications. All of the application logs, including system logs, are processed by Logstash. Would it be a safe assumption to say 1 MB per log/event, which comes to 5000 MB/sec? Are we talking about 5 GB/s processed by Logstash into Elasticsearch?

1 MB might be a little big.
One line of my dmesg sits at 1.2 KB.
One line of a zip file that I accidentally processed is 578 bytes.
One line from a custom application is 1.8 KB.
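If you want a number for your own data, here is a quick sketch that averages the line size of one of your log files (the path below is just a placeholder; point it at a real log):

```python
# Average bytes per line of a log file.
path = "/var/log/syslog"  # placeholder; use one of your own logs

total_bytes = 0
line_count = 0
with open(path, "rb") as f:
    for line in f:
        total_bytes += len(line)
        line_count += 1

if line_count:
    print(f"{line_count} lines, avg {total_bytes / line_count:.0f} bytes/line")
```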

So if you assume 1 MB per event, you will definitely overshoot anything you are likely to be processing. But just to clarify: when you say log/event, are you referring to a single line in a log file?
If you are referring to the size of an entire log file, then I would suggest taking the total size of the log file and multiplying it by 4.8.
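For example, applying that rule of thumb (the 250 MB input figure below is made up for illustration):

```python
# Apply the 4.8x rule of thumb from the reply above to a whole log file.
log_file_mb = 250                  # hypothetical raw log file size
estimated_mb = log_file_mb * 4.8   # estimated resulting size
print(f"~{estimated_mb:.0f} MB")   # ~1200 MB
```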
