Hello,
Today I have a problem where I have to consume an nginx log, but this log file is written faster than Logstash can collect it.
Is there some way to improve log collection performance?
Logstash shouldn't have any problems reading from a file at a decent rate, but its filters could be slowing it down. What kind of filters do you have? Which version of Logstash? Are your CPUs saturated? Also, Logstash's input rate depends on its output rate: no matter how fast your filters are, you're still in trouble if the outputs are blocked, e.g. because Elasticsearch can't receive the data fast enough.
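As a starting point for narrowing this down, here is a minimal pipeline sketch for an nginx access log. The paths, grok pattern, and Elasticsearch host are assumptions, so adjust them to your setup:

```
# Hypothetical pipeline for an nginx access log -- path, pattern, and host
# are assumptions; adjust to your environment.
input {
  file {
    path => "/var/log/nginx/access.log"
    start_position => "beginning"
  }
}

filter {
  # An expensive or heavily backtracking grok pattern is a common cause
  # of slow throughput.
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

If throughput improves when you temporarily swap the elasticsearch output for something trivial like `stdout { codec => dots }`, the bottleneck is on the output/Elasticsearch side rather than in the filters. Depending on your Logstash version you can also raise the number of pipeline workers and the batch size (pipeline.workers and pipeline.batch.size in logstash.yml, or -w and -b on the command line) to make better use of your CPU cores.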