Consuming a large amount of data from files with Logstash

Hi,

I have a Logstash shipper that consumes nginx logs plus logs from other applications through multiple file inputs in the Logstash configuration. When traffic to the website increases, Logstash falls behind collecting the data and we end up with a gap in the dashboard.
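For context, here is a minimal sketch of the shipper side, with one file input per source feeding the Redis list. The paths, host, and key below are placeholders, not our real values:

```
# Rough sketch of a shipper config; paths, host, and key are placeholders.
input {
  file {
    path => "/var/log/nginx/access.log"
    type => "nginx-access"
  }
  file {
    path => "/var/log/myapp/*.log"   # one file input per application
    type => "myapp"
  }
}

output {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
```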

My Redis queue stays empty the whole time.

Does anyone have a tip about this?

How many messages per second is Logstash processing? Do you have any time-consuming filters? How's the CPU load?

My index has 109,090,727 documents, so we're processing about 2,525 messages per second.

I have many filters with several business rules in the Logstash instance that reads the log files.
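As a rough illustration only (the fields and patterns here are made up, our real rules are more involved), the filters are gated by event type so each log only runs through its own rules:

```
# Illustrative only; patterns and tags are placeholders.
filter {
  if [type] == "nginx-access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [type] == "myapp" {
    mutate {
      add_tag => ["myapp"]   # placeholder for application-specific rules
    }
  }
}
```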

The problem happens when nginx gets a lot of traffic :frowning:

One more piece of information: I have only one Logstash instance consuming several logs, not just the nginx log.

Well, it sounds like you might need to profile and optimize your filters a bit. If you show us what you've got we might be able to help out. To increase throughput make sure you saturate your CPUs, e.g. by increasing the number of filter workers with the -w startup option.
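For example, something like this (assuming a typical install layout; the config path is a placeholder):

```
# Four filter workers; one per CPU core is a common starting point.
# (On Logstash 1.x the "agent" subcommand is required; later versions drop it.)
bin/logstash agent -f /etc/logstash/indexer.conf -w 4
```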

We talked it over and we're going to make some changes to the Logstash Log Reader and the Logstash Indexer.

:smile: