I've just started using Logstash 2.2.2 with 3 hosts uploading a combined average of 20,000 events per day, using Filebeat 1.1 to forward events. Typically after about 12 hours I get an OutOfMemoryError in the Logstash log. I've increased the heap size to 2 GB, and it then lasts a little more than 24 hours before running out again.
What is a typical Logstash heap size, and where is the best place to start troubleshooting this?
Update: it turns out to be related to using an if statement in the output section of Logstash. I wanted separate indexes based on the source type (set with the prefix "myapp"); however, with that conditional in place, I get what appears to be a memory leak. When I remove it and go with a static index name, Logstash does not run out of memory.
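For reference, my output section is along these lines (simplified sketch; the hosts and exact type values here are placeholders, not my real config):

```
output {
  if [type] == "myapp-access" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "myapp-access-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "myapp-other-%{+YYYY.MM.dd}"
    }
  }
}
```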
Is there a better way to separate indexes based on type?
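For example, would routing with a sprintf reference in the index name, instead of conditionals, behave any differently? Something like this (again a sketch, assuming every event has a usable `type` field):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # index name derived from the event's type field at write time
    index => "%{type}-%{+YYYY.MM.dd}"
  }
}
```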