Consuming a large amount of data from files with Logstash

(Jeferson Martins) #1


I have a Logstash instance that consumes nginx logs and logs from other applications through multiple file inputs in the Logstash configuration. When traffic to the website increases, Logstash falls behind collecting the information and we end up with a gap in the dashboard.

My Redis queue stays empty the whole time.
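For reference, a minimal sketch of the kind of multi-input setup described above, shipping to Redis (the file paths and Redis key here are assumptions for illustration, not taken from this thread):

```conf
# Shipper pipeline: multiple file inputs feeding a Redis list (hypothetical paths and key)
input {
  file {
    path => "/var/log/nginx/access.log"
    type => "nginx"
  }
  file {
    path => "/var/log/myapp/app.log"   # hypothetical application log
    type => "app"
  }
}
output {
  redis {
    host      => "127.0.0.1"
    data_type => "list"
    key       => "logstash"
  }
}
```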

Does anyone have a tip for this?

(Magnus Bäck) #2

How many messages is Logstash processing per second? Do you have any time consuming filters? How's the CPU load?

(Jeferson Martins) #3

My index has 109090727 documents, so we're getting about 2525 per second.
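As a sanity check on that rate (assuming the 109090727 documents cover roughly a 12-hour window, which is an assumption, not stated above):

```python
# Rough throughput estimate: total documents divided by the window in seconds.
# The 12-hour window is an assumption used for illustration.
total_docs = 109_090_727
window_seconds = 12 * 60 * 60  # 43200 s
rate = total_docs / window_seconds
print(round(rate))  # roughly 2525 events/second
```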

I have many filters with several business rules in the Logstash instance that reads the log files.
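Without seeing the actual filter configuration it's hard to say more, but as an illustration, unanchored grok patterns are a common source of CPU overhead in filters like these; a sketch (the conditional and pattern are hypothetical, not from this thread):

```conf
filter {
  if [type] == "nginx" {
    grok {
      # Anchoring the pattern with ^ and $ avoids expensive backtracking
      # when a line does not match.
      match => { "message" => "^%{COMBINEDAPACHELOG}$" }
    }
  }
}
```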

(Jeferson Martins) #4

The problem happens when we get a lot of nginx traffic :frowning:

(Jeferson Martins) #5

One more piece of information: I have only one Logstash instance consuming several logs, not just the nginx log.

(Magnus Bäck) #6

Well, it sounds like you might need to profile and optimize your filters a bit. If you show us what you've got we might be able to help out. To increase throughput make sure you saturate your CPUs, e.g. by increasing the number of filter workers with the -w startup option.
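For example, the worker count can be passed at startup like this (the worker count and config path below are placeholders):

```shell
# Start Logstash with 4 filter workers (placeholder values)
bin/logstash -w 4 -f /etc/logstash/logstash.conf
```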

(Jeferson Martins) #7

We discussed it and will make some changes to the Logstash log reader and the Logstash indexer.

