I have a problem: my Logstash JVM has only 512 MB configured:
-Xms256m
-Xmx512m
Logstash reads messages from mylog.json and sends them to my broker, BUT when a very large message is written to mylog.json, the Logstash JVM throws a java.lang.OutOfMemoryError: Java heap space. If I increase the JVM memory everything works fine, BUT I cannot allocate any more memory.
You could try to decrease the number of events Logstash processes at once by adjusting both the number of workers and the batch size.
By default, pipeline.workers is set to the number of CPU cores on your host and pipeline.batch.size is set to 125; you will need to lower those values until you find a combination that doesn't cause OOM errors.
You can find more information about these settings in the documentation.
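As a concrete starting point, you could set something like this in logstash.yml (the values here are illustrative guesses for a 512 MB heap, not recommendations from the docs; tune them for your workload):

```yaml
# logstash.yml -- reduce in-flight events to lower heap pressure
# workers * batch.size = max events held in memory per pipeline stage
pipeline.workers: 1      # default: number of CPU cores
pipeline.batch.size: 25  # default: 125
```

The same settings can be passed on the command line with `bin/logstash -w 1 -b 25` for quick experimentation before committing them to the config file.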
You should reduce both, but there is not much else you can do, as 512 MB is already quite small.
You can also try using persistent queues; that way Logstash will not buffer events in the in-memory queue. This part of the documentation explains how to configure the persistent queue.
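A minimal persistent-queue configuration looks like this (the max_bytes value is an example; size it to your available disk, keeping in mind the queue spills events to disk instead of holding them on the heap):

```yaml
# logstash.yml -- buffer events on disk instead of in memory
queue.type: persisted    # default is "memory"
queue.max_bytes: 1gb     # total disk capacity the queue may use
```

Note that the persistent queue helps with backpressure and event buffering, but a single message larger than the remaining heap can still trigger an OOM while it is being deserialized.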