I have a problem. My Logstash JVM is configured with only 512 MB of heap.
Logstash reads messages from mylog.json and sends them to my broker, but when a very large message is written to mylog.json, the Logstash JVM throws a java.lang.OutOfMemoryError: Java heap space. If I increase the JVM memory it works fine, but I cannot allocate any more memory.
What can I do?
You could try to decrease the number of events that Logstash processes at once by adjusting both the number of workers and the batch size.
By default, pipeline.workers is set to the number of CPU cores on your host and pipeline.batch.size is set to 125. You will need to change those values until you find a combination that does not give you OOM errors.
You can find more information about those settings in the documentation.
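For reference, a minimal sketch of how those two settings might look in logstash.yml; the values below are illustrative starting points, not a recommendation, and need to be tuned for your workload:

```yaml
# logstash.yml -- illustrative values only
# Fewer workers and a smaller batch size mean fewer events
# held in heap at the same time, at the cost of throughput.
pipeline.workers: 1
pipeline.batch.size: 50
```

The in-flight event count is roughly workers × batch size, so shrinking either one reduces peak heap pressure from large events.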
Thank you for your reply. Should I increase the workers and batch size, or reduce them?
I've tried both increasing and reducing these values and the same error appears. What else can I try?
You should reduce both, but there is not much else you can do, as 512 MB is already small.
You can also try persistent queues; that way Logstash will not keep the event queue in memory. This part of the documentation explains how to configure the persistent queue.
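A minimal sketch of enabling the persistent queue in logstash.yml; the path and size below are example values you would adjust for your host:

```yaml
# logstash.yml -- persistent queue sketch (example values)
queue.type: persisted          # default is "memory"
queue.max_bytes: 1gb           # cap on the on-disk queue size
```

Note that the persistent queue moves buffered events to disk, but a single oversized event still has to fit in heap while a filter or output processes it, so this may not fully eliminate the OOM for very large messages.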
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.