How to decrease terms_memory footprint


#1

Hi! I'm using ES 2.0 to index billions of docs a day (daily indices kept for long-term usage). The cluster is relatively small: 10 data nodes (30 GB heap each) and 3 dedicated masters. Right now I can't grow the data tier much, so I'm looking for an affordable solution.

The heap is constantly over 90% used, mostly by segments.memory_in_bytes. How can I limit the biggest memory consumer, terms_memory_in_bytes?
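To see which indices dominate terms memory, the per-index segment stats from `GET /_stats/segments` can be aggregated. A minimal sketch, assuming the response shape of the 1.x/2.x stats API (`indices.<name>.total.segments.terms_memory_in_bytes`); verify the field names against your cluster's actual response:

```python
def terms_memory_by_index(stats):
    """Return {index_name: terms_memory_in_bytes}, largest first.

    `stats` is the parsed JSON response of GET /_stats/segments.
    """
    usage = {
        name: body["total"]["segments"].get("terms_memory_in_bytes", 0)
        for name, body in stats.get("indices", {}).items()
    }
    # Sort descending so the biggest memory consumers come first.
    return dict(sorted(usage.items(), key=lambda kv: -kv[1]))


if __name__ == "__main__":
    # Hypothetical sample response, trimmed to the fields used above.
    sample = {
        "indices": {
            "logs-2016.01.01": {"total": {"segments": {"terms_memory_in_bytes": 512}}},
            "logs-2016.01.02": {"total": {"segments": {"terms_memory_in_bytes": 2048}}},
        }
    }
    print(terms_memory_by_index(sample))
```

Ranking the daily indices this way makes it easy to decide which ones are worth closing or force-merging first.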

The old way to limit it was removed a long time ago ( https://github.com/elastic/elasticsearch/issues/3912 ), but I wonder if there is any other way to do this in ES 1.7 / 2.x?

As an option I could open and close older indices at query time (open -> search -> close), but IMHO that is a time- and resource-consuming approach. The main disadvantages are how long the open/close operations take for periods spanning two or three months, and the fact that ES may decide to rebalance shards for such indices.
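The open -> search -> close cycle above can be sketched as a small wrapper. The HTTP layer is injected as a hypothetical callable `do_request(method, path, body)` so the sequencing is explicit; the endpoints (`_open`, `_close`, `_search`) are the standard ES index APIs:

```python
def search_closed_index(index, query, do_request):
    """Open a closed index, run a search, and close it again.

    `do_request` is a hypothetical (method, path, body) -> response
    callable, e.g. a thin wrapper around an HTTP client.
    """
    do_request("POST", "/%s/_open" % index, None)
    try:
        return do_request("GET", "/%s/_search" % index, query)
    finally:
        # Close again even if the search fails, so heap is reclaimed.
        do_request("POST", "/%s/_close" % index, None)
```

Note this sketch does not address the rebalancing concern: while the index is open, the allocator may start moving its shards unless allocation is disabled for it.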


Segments memory_in_bytes excessively large with allot of open indices
(system) #2