I am getting a heap dump / out-of-memory error in production, due to which I am getting inconsistent data in Kibana.

Can you please help me solve this? I will send you the error messages.
The instance type is t3a.medium.

In /usr/share/logstash the output is:

OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x000000008a660000, 1973026816, 0) failed; error='Cannot allocate memory' (errno=12)

There is insufficient memory for the Java Runtime Environment to continue.

Native memory allocation (mmap) failed to map 1973026816 bytes for committing reserved memory.

An error report file with more information is saved as:


free -h
              total        used        free      shared  buff/cache   available
Mem:           3.7G        3.3G        132M         41M        182M         85M
Swap:          1.0G        249M        774M

Output of the top command:

Please help.

To find the problem you should take a look at the heap dump in a tool like MAT (Eclipse Memory Analyzer). If you find yourself spending more than a minute trying to work out what is using all the memory, give up: it should be front and centre on the main page as leak candidate #1.
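If no .hprof file exists yet to open in MAT, the JVM can be told to write one automatically on the next OutOfMemoryError. A sketch of the flags, assuming they go in /etc/logstash/jvm.options (recent Logstash packages ship the first flag by default, so check before adding it twice); the dump path below is only an example:

```
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/var/log/logstash/heapdump.hprof
```

Make sure the path is on a filesystem with enough free space to hold a dump roughly the size of the heap.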

See this thread for an example of what you should see.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.