I have Elasticsearch 2.3.4 installed, but I find that it goes out of memory quite often. ES_HEAP_SIZE is set to 1.5 GB (which is 50% of the available RAM). I found that the cause could be the setting indices.fielddata.cache.size, which is unbounded by default, so I tried changing it by adding indices.fielddata.cache.size: 30% to elasticsearch.yml. However, Elasticsearch doesn't recognize the property. I don't understand the issue; according to the Elasticsearch documentation this key should be recognized. Any idea why this happens?
What do you mean by "Elasticsearch doesn't recognize the property"? What exactly happens when you add it?
When I add indices.fielddata.cache.size: 30% to elasticsearch.yml, it throws a parser exception like the one below:
```
Aug 3 14:48:33 server-001 elasticsearch: Exception in thread "main" SettingsException[Failed to load settings from [elasticsearch.yml]]; nested: ParserException[while parsing a block mapping
Aug 3 14:48:33 server-001 elasticsearch:  in 'reader', line 50, column 1:
Aug 3 14:48:33 server-001 elasticsearch:     indices.fielddata.cache.size: 30%
Aug 3 14:48:33 server-001 elasticsearch:     ^
Aug 3 14:48:33 server-001 elasticsearch: expected <block end>, but found BlockMappingStart
Aug 3 14:48:33 server-001 elasticsearch:  in 'reader', line 55, column 2:
Aug 3 14:48:33 server-001 elasticsearch:     network.host: localhost
Aug 3 14:48:33 server-001 elasticsearch:     ^
Aug 3 14:48:33 server-001 elasticsearch: ];
Aug 3 14:48:33 server-001 elasticsearch: Likely root cause: while parsing a block mapping
Aug 3 14:48:33 server-001 elasticsearch:  in 'reader', line 50, column 1:
Aug 3 14:48:33 server-001 elasticsearch:     indices.fielddata.cache.size: 30%
```
That looks like a broken config file, maybe an indentation issue? It's hard to tell just from the exception. I'd suggest you take a look at the config file around the lines it points to.
Oops, stupid me. I was missing a space. Thanks!
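For anyone who finds this later: I believe the missing space was after the colon (reconstructing from memory), which makes the YAML parser read the whole line as a bare scalar instead of a key/value pair, hence the "expected <block end>" error when it reaches the next mapping line:

```
# Broken: no space after the colon, so YAML sees one plain scalar
indices.fielddata.cache.size:30%

# Fixed: YAML needs a space between the colon and the value
indices.fielddata.cache.size: 30%
```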
We don't usually recommend running ES with any decent amount of data on less than 2 GB of heap.
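If you do bump it, note that on 2.x the heap is normally set via the ES_HEAP_SIZE environment variable; for package installs that typically lives in /etc/default/elasticsearch or /etc/sysconfig/elasticsearch, depending on the distro. A minimal sketch, assuming a package install:

```
# Sets both -Xms and -Xmx for the JVM; keep it at or below ~50% of RAM
# so the OS page cache still has room for Lucene's file access.
ES_HEAP_SIZE=2g
```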