The Logstash node has 16 GB RAM, and the min and max heap size have both been set to 8 GB.
I'm seeing the following error when testing the Logstash pipeline. What is the recommended setting to avoid this issue?
Less than 8 GB for the min and max heap?
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000005d4cc0000, 8241020928, 0) failed; error='Cannot allocate memory' (errno=12)
There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (mmap) failed to map 8241020928 bytes for committing reserved memory.
An error report file with more information is saved as:
Sure, thanks. What is the default heap size? I'm not sure whether the default heap size setting, which is less than 8 GB, is sufficient, and why the above-mentioned error is showing up. This is a dedicated Logstash node. After restarting Logstash it is fine now. Not sure what caused this issue.
> I'm not sure whether the default heap size setting, which is less than 8 GB, is sufficient, and why the above-mentioned error is showing up.
The error message doesn't mean "Logstash has run out of heap, increase the heap"; it means "Logstash can't allocate memory for your heap, so decrease the heap (or fix whatever is limiting the allocation)".
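Since errno=12 (ENOMEM) means the mmap call itself failed, a quick first check is whether the host really had that much memory free when Logstash started. A minimal sketch of that check on a Linux host (these are standard Linux tools, nothing Logstash-specific; adjust for your distribution):

# How much RAM and swap are actually free?
free -h

# Any per-process virtual memory limit in effect for this shell/service?
ulimit -v

# Kernel overcommit policy; a value of 2 ("never overcommit") can make
# large mmap calls fail even when some RAM appears free.
cat /proc/sys/vm/overcommit_memory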
> This is a dedicated Logstash node.
It's very unlikely that you'll need anything close to 16 GB RAM just to run Logstash.
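If you do want to lower it, the heap is set in Logstash's config/jvm.options file (or /etc/logstash/jvm.options for package installs). Recent Logstash versions ship with a 1 GB default heap there; the 4 GB below is just an illustrative value for a 16 GB box, not a recommendation specific to your workload:

# config/jvm.options
# Keep -Xms and -Xmx equal so the heap doesn't resize at runtime.
-Xms4g
-Xmx4g

After editing the file, restart Logstash for the change to take effect.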