I have an index of around 70 GB containing 33,992,470 documents.
I am running facet queries on this index. Queries give a prompt response
if I use a single client to query.
But the same system hangs and goes out of memory when multiple clients
query at the same time.
I have assigned 10 GB of memory to Elasticsearch.
Can anybody guide me as to where the problem is and what the resolution would be?
I tried the flush API to recover memory, but it didn't help.
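(For context, the flush call referred to above takes roughly this form in the 0.x REST API; the index name below is only a placeholder:)

    curl -XPOST 'http://localhost:9200/myindex/_flush'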
If multiple clients are running and causing you problems, maybe the fact
that several search requests are executing concurrently is the problem. Are you
asking for facets that return a lot of data? Also, monitor the JVM heap
memory usage (bigdesk can help here); if it's really close to 10 GB, you need
to either increase the memory allocated to ES or add more nodes to the
cluster.
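(A minimal way to check heap usage without bigdesk, assuming a node reachable on localhost; on the 0.18.x line the nodes stats endpoint lived under _cluster/nodes, while newer releases expose it as /_nodes/stats:)

    # JVM heap used vs. committed for every node in the cluster
    curl -XGET 'http://localhost:9200/_cluster/nodes/stats?pretty=true'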
I tried to install bigdesk using "plugin -install lukas-vlcek/bigdesk", but
on running it I got an error saying it only works with 0.19.x or 0.20.x.
I am using 0.18.7; what should I do to resolve this? I want to use bigdesk
to monitor my Elasticsearch instance.
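(In case it is useful, the 0.x plugin script also accepts an explicit release in the form user/repo/version, so an older bigdesk release could be pinned once the right version for 0.18.x is known; the version below is only a placeholder:)

    bin/plugin -install lukas-vlcek/bigdesk/<version>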
Please help.
Thanks,
Pulkit Agrawal