Memory error in term facet


(Kartavya) #1

Hi,

I am running a terms facet query on Elasticsearch. My index is 5 GB in size
with a total of 2M documents.
When I run the terms facet, I get a Java heap space error.

My index request is as follows:

curl -XPUT 'localhost:9200/_river/pa_index2/_meta' -d '{
  "type": "couchdb",
  "couchdb": {
    "host": "localhost",
    "port": 5984,
    "db": "tweet_db"
  },
  "index": {
    "index": "pa_index2",
    "type": "pa_index_feed2",
    "bulk_size": "100",
    "bulk_timeout": "10ms"
  },
  "mapping": {
    "tweet": {
      "properties": {
        "polarity": { "type": "integer" }
      }
    }
  }
}'

Please help.

Regards,
Pulkit Agrawal


(David Pilato) #2

You have provided a river request, not the facet request.
The way you get your documents into the index has no bearing on your facet searches.
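For comparison, a terms facet search request (as opposed to the river request above) would be sent to the `_search` endpoint. This is only a sketch: the index name `pa_index2` and the field `polarity` are taken from the river configuration in the original post, and the facet name `polarity_facet` is made up for the example.

```shell
# Hypothetical terms facet request against the index from the original post.
# "size" limits how many facet terms are returned.
curl -XPOST 'localhost:9200/pa_index2/_search' -d '{
  "query": { "match_all": {} },
  "facets": {
    "polarity_facet": {
      "terms": { "field": "polarity", "size": 10 }
    }
  }
}'
```

This is the kind of request David is asking to see, since the memory cost of a terms facet depends on the field being faceted on, not on how the documents were indexed.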

You also need to give more details about your configuration: memory allocated
to ES (Xmx, ...), how many shards for your index, how many nodes, ...
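As a sketch of the kind of configuration details being asked for, the heap allocated to Elasticsearch is set via JVM options when starting the node; depending on the version, this can be done through environment variables like the following (the 2g value here is just an illustrative figure, not a recommendation):

```shell
# Hypothetical example: give the ES JVM a 2 GB heap before starting the node.
# Variable names may differ between Elasticsearch versions.
export ES_MIN_MEM=2g
export ES_MAX_MEM=2g
./bin/elasticsearch
```

Terms facets load the faceted field's values into memory, so a heap that is too small for the cardinality of the field is a common cause of this kind of OutOfMemoryError.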

FYI, I also get memory errors with facets over 2M docs, but that's mainly because
I'm running tests on small instances (Windows with only 1.5 GB allocated to ES).

On October 11, 2011 at 10:12, Pulkit Agrawal pulkitdotcom@gmail.com wrote:

--
David Pilato
http://dev.david.pilato.fr/
Twitter : @dadoonet


(system) #3