Hi,
I have some Logstash instances that read log files several GB in size.
Sometimes Logstash stops sending logs to my Redis, and when I check, the log files have grown to several GB.
I'm using Logstash 1.4.2. Is there a way to resolve this in Logstash?
Could you please elaborate?
When I have big log files, my Logstash instances have GC problems.
What does your config look like?
Can you upgrade to 2.2?
It only reads log files and ships them to Redis.
I currently run Logstash in Docker and haven't tested version 2.2 yet.
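For reference, a pipeline like the one described would look roughly like the minimal sketch below; the file paths, Redis host, and key are placeholders, not the poster's actual settings:

```
input {
  file {
    # Read application log files from disk (placeholder path)
    path => "/var/log/app/*.log"
    start_position => "beginning"
  }
}

output {
  redis {
    # Push events onto a Redis list (placeholder host and key)
    host => "redis.example.com"
    data_type => "list"
    key => "logstash"
  }
}
```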