I have ES 1.7, and I got an "OutOfMemoryError: Java heap space".
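For reference, until the new hardware arrives it may be worth checking how much heap the node actually gets. A minimal sketch, assuming a default ES 1.x install where the startup script reads the `ES_HEAP_SIZE` environment variable (the `4g` value is just a placeholder; adjust to your machine):

```shell
# ES 1.x reads its JVM heap size from ES_HEAP_SIZE before starting.
# Common guidance: roughly 50% of available RAM, and under ~30g so
# compressed object pointers stay enabled.
export ES_HEAP_SIZE=4g
```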
I know the cause is that I don't have enough RAM for the amount of data I'm inserting into ES. Unfortunately, new hardware is coming only in a few weeks.
My questions are:
How do I recover the index (its status is yellow) so that I can keep using it?
How do I insert data in smaller batches so it won't happen again? (I'm using Logstash 2.1 and Filebeat, and have limited bulk_max_size to 20.)
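For context, the `bulk_max_size` I'm referring to sits in the Filebeat output section. A minimal sketch of the relevant fragment, assuming the Logstash output (the `hosts` value is a placeholder for my setup):

```yaml
output:
  logstash:
    hosts: ["localhost:5044"]
    # Maximum number of events batched into a single request
    bulk_max_size: 20
```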
But I still have a problem: when looking at the index in Kibana I get the OOM error and can't query the index.
I assume that's the cause, since it happens when I'm inserting large amounts of data. For example, I first had bulk_max_size set to 50 (the default) and got OOM after an hour or so. After changing bulk_max_size to 20, it worked for almost a week, inserting 130 million documents.
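To see where the index stands once the node is responsive again, the cluster health and cat shards APIs (both present in ES 1.7) report why it's yellow; on a single node that's usually unassigned replica shards. A sketch, assuming ES is listening on localhost:9200 (these obviously need a live cluster to run against):

```shell
# Overall cluster health plus a per-index breakdown
curl -s 'localhost:9200/_cluster/health?level=indices&pretty'

# Which shards are unassigned, per index
curl -s 'localhost:9200/_cat/shards?v'
```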