We have a very high number of filter evictions per query (~10).
I tried to enlarge the cache size to 30%, but still, I see this number at
~9.
I'm trying to debug this (I think huge queries might be affecting
this) - so I wanted to dump the cache, or something similar, to try to
understand what is using all that space.
Is there any way I can do this? Or any other recommended way to debug?
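Short of a heap dump, the stats APIs at least show how big the filter cache is and how often it evicts. A minimal sketch, assuming an Elasticsearch 1.x node reachable on localhost:9200 (adjust host, port, and version to your cluster):

```shell
# Per-node filter cache size and eviction counts
curl -s 'localhost:9200/_nodes/stats/indices/filter_cache?pretty'

# Per-index breakdown, to see which indices dominate the cache
curl -s 'localhost:9200/_stats/filter_cache?pretty'
```

Watching the per-index numbers over a few minutes can narrow down which indices (and therefore which query patterns) are churning the cache, before resorting to a heap dump.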
Mark, thanks so much.
I tried to take a heap dump, but running jmap always knocks the server out
of the cluster.
These are production servers; is there any way to do it live, without
impacting the cluster?
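One common workaround is to snapshot the process with gcore, which pauses the JVM only while the core file is written, and then run jmap against the core offline. A hedged sketch, assuming JDK 7/8-era tools on the PATH; ES_PID and the paths are placeholders, not values from this thread:

```shell
# Write a core file of the running Elasticsearch JVM.
# The process is paused only for the duration of the core write,
# which is usually much shorter than a full in-process heap dump.
ES_PID=$(pgrep -f org.elasticsearch.bootstrap.Elasticsearch)
gcore -o /tmp/es-core "$ES_PID"

# Extract an hprof heap dump from the core file offline;
# the live JVM is no longer involved at this point.
jmap -dump:format=b,file=/tmp/es-heap.hprof "$(command -v java)" "/tmp/es-core.$ES_PID"
```

Even so, the pause may still exceed the Zen fault-detection timeout on a loaded node, so it can help to temporarily raise `discovery.zen.fd.ping_timeout` (or disable shard reallocation) before taking the snapshot.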
Thanks.
On Tuesday, March 3, 2015 at 11:11:02 PM UTC+2, Mark Walkom wrote:
You cannot see what is in the cache unless you extract a heap dump.
On 4 March 2015 at 02:56, Roy Reznik <r...@adallom.com> wrote: