Hi All,

I am getting the error below while indexing records into an ES cluster:

Exception in thread "elasticsearch[Ghost][transport_client_worker][T#8]{New I/O worker #93}" java.lang.OutOfMemoryError: GC overhead limit exceeded

We have a 2-node cluster, and each ES node holds 10 GB of data. There are 500 indices in total, with 500 shards on each node. The total number of documents in the cluster is around 6 million.
On Tuesday, 22 October 2013 00:34:09 UTC+5:30, Mohit Kumar Yadav wrote:

Hi Ankit,

What are the sizes of the following?

ES_MIN_MEM = ?? (minimum size)
ES_MAX_MEM = ?? (maximum size)

By default they are 256m and 1g; just change the sizes to suit your requirements.

I hope this resolves your problem.
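For anyone landing on this thread: ES_MIN_MEM and ES_MAX_MEM are environment variables read by the stock startup wrapper (bin/elasticsearch.in.sh in 0.90-era installs), which maps them onto the JVM's -Xms and -Xmx flags. A minimal sketch of overriding them before start-up; the 4g value is purely illustrative:

    # Override the defaults before starting the node; the wrapper
    # script turns these into -Xms/-Xmx for the JVM.
    export ES_MIN_MEM=4g
    export ES_MAX_MEM=4g
    ./bin/elasticsearch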
This won't completely help you, as it won't point out the exact OOM culprit, but it will at least tell you what's going on with your JVM memory pools; if you correlate that with your other Elasticsearch and/or JVM metrics, you may be able to trace this down more easily.
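"GC overhead limit exceeded" means the JVM is spending nearly all of its time in garbage collection while reclaiming almost nothing, so the pool stats are a good place to look. A sketch using standard JDK tooling (the process id is illustrative):

    # Find the Elasticsearch JVM, then sample GC and pool occupancy
    # every 5 seconds: S0/S1 = survivor spaces, E = eden, O = old gen,
    # P = permgen, YGC/FGC = young/full GC counts, GCT = total GC time.
    jps -l | grep -i elasticsearch
    jstat -gcutil 12345 5000

An old generation that stays near 100% while the full-GC count climbs is the usual signature behind this error.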
Please find the min and max memory sizes below:

ES_MIN_MEM = 256m
ES_MAX_MEM = 10 GB

Thanks for your response.
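Side note: with a mismatch like this, the heap starts at 256m and has to grow toward 10 GB under load. The 0.90-era elasticsearch.in.sh also reads ES_HEAP_SIZE, which feeds both variables so -Xms and -Xmx come out equal; a sketch, value illustrative:

    # ES_HEAP_SIZE sets both ES_MIN_MEM and ES_MAX_MEM in the stock
    # wrapper, so the heap is not resized at runtime.
    export ES_HEAP_SIZE=10g
    ./bin/elasticsearch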