Consuming many logs from a file


Today I have a problem where I have to consume an nginx log, but lines are written to this log file faster than Logstash can collect them.

Is there some way to improve the performance of log collection?

Logstash shouldn't have any problems reading from a file at a decent rate, but its filters could be slowing it down. What kind of filters do you have? Which version of Logstash? Are your CPUs saturated? Also, Logstash's input rate depends on its output rate: no matter how fast your filters are, you're still in trouble if the outputs are blocked, e.g. because Elasticsearch can't receive the data fast enough.
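While you're checking the points above, one knob worth knowing about is the pipeline's parallelism and batching, set in logstash.yml. A minimal sketch, assuming a reasonably recent Logstash; the values below are illustrative starting points, not recommendations, and the right numbers depend on your core count and event size:

```
# logstash.yml — pipeline tuning (illustrative values, tune for your hardware)
pipeline.workers: 8        # defaults to the number of CPU cores; more workers run filters/outputs in parallel
pipeline.batch.size: 250   # events each worker pulls per batch; larger batches generally favor throughput
pipeline.batch.delay: 50   # max milliseconds to wait for a batch to fill before flushing it
```

If the CPUs aren't saturated and throughput still lags, that points back at a blocked output rather than the filters.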