Recover from Out of Memory Error


I have ES 1.7, and I got an "OutOfMemoryError: Java heap space".
I know the cause is that I don't have enough RAM for the amount of data I'm inserting into ES. Unfortunately, new hardware won't arrive for a few weeks.

My questions are:

  1. How can I recover the index (its status is yellow) so that I can keep using it?
  2. How can I insert smaller amounts of data so it won't happen again? (I'm using Logstash 2.1 and Filebeat, and have limited bulk_max_size to 20.)
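For reference, this is roughly where that limit lives in the Filebeat config — a minimal sketch, assuming the Logstash output and a hypothetical hostname (Filebeat 1.x syntax):

```yaml
output:
  logstash:
    # hypothetical host/port -- replace with your Logstash endpoint
    hosts: ["logstash.example.com:5044"]
    # maximum number of events bundled into a single Logstash request
    bulk_max_size: 20
```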


Yellow is ok-ish: it means the cluster cannot assign a replica. You can remove the replica and then re-add it via the APIs.
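A sketch of how to do that with the index settings API, assuming the node runs on localhost:9200 and a hypothetical index name `myindex` — setting replicas to 0 and then back to 1:

```shell
# Drop the unassigned replica (index goes green, no replica copies)
curl -XPUT 'http://localhost:9200/myindex/_settings' -d '
{
  "index": { "number_of_replicas": 0 }
}'

# Later, once the cluster is healthy again, re-add the replica
curl -XPUT 'http://localhost:9200/myindex/_settings' -d '
{
  "index": { "number_of_replicas": 1 }
}'
```

Note that with replicas at 0 you have no redundancy, so only leave it that way until the memory pressure is resolved.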

How do you know this was the issue?

Hi, thanks for the answer

  1. But I still have a problem: when looking at the index in Kibana, it gives me the OOM error and I can't query the index.

  2. I assume that's the problem, since it happens when I'm inserting large amounts of data. For example, I first had bulk_max_size set to 50 (the default) and got OOM after an hour or so. After changing bulk_max_size to 20, it worked for almost a week, inserting 130 million documents.

Any help?

Please post the full error you receive.

In Kibana I see:
"ElasticsearchException[java.lang.OutOfMemoryError: Java heap space]"

I can't get you the ES logs right now since it's on a closed network.