ES Breaker memory limit

Hi! When I run a search in Kibana, I get some errors...

Elasticsearch logs:

[2015-10-30 07:50:15,054][WARN ][indices.breaker          ] [belerian] [FIELDDATA] New used memory 520090414 [495.9mb] from field [src_ip] would be larger than configured breaker: 519438336 [495.3mb], breaking
[2015-10-30 07:50:15,071][WARN ][indices.breaker          ] [belerian] [FIELDDATA] New used memory 519924151 [495.8mb] from field [src_ip] would be larger than configured breaker: 519438336 [495.3mb], breaking
[2015-10-30 07:50:15,161][WARN ][indices.breaker          ] [belerian] [FIELDDATA] New used memory 519925465 [495.8mb] from field [src_ip] would be larger than configured breaker: 519438336 [495.3mb], breaking
[2015-10-30 07:50:15,183][WARN ][indices.breaker          ] [belerian] [FIELDDATA] New used memory 519626627 [495.5mb] from field [src_ip] would be larger than configured breaker: 519438336 [495.3mb], breaking
[2015-10-30 07:50:23,930][WARN ][indices.breaker          ] [belerian] [FIELDDATA] New used memory 519734504 [495.6mb] from field [dst_host.raw] would be larger than configured breaker: 519438336 [495.3mb], breaking
[2015-10-30 07:50:32,267][WARN ][indices.breaker          ] [belerian] [FIELDDATA] New used memory 520170202 [496mb] from field [dst_host.raw] would be larger than configured breaker: 519438336 [495.3mb], breaking

The machine has 15 GB of RAM.

I have ES_HEAP_SIZE set to 10 GB:

root@belerian:~# export
declare -x ES_HEAP_SIZE="10g"
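
One thing I can check is whether the JVM really got the 10 GB, since the breaker limit in the logs (495.3mb) looks more like 50% of a ~1 GB heap than of a 10 GB one. A quick check, assuming Elasticsearch listens on the default localhost:9200:

curl -s 'localhost:9200/_nodes/jvm?pretty' | grep heap_max

heap_max_in_bytes should report roughly 10 GB per node if the setting was picked up.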

And I set the fielddata breaker limit to 50% of the heap (Lucene searches?):

indices.breaker.fielddata.limit: "50%"
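
To see which fields are actually eating the fielddata, the _cat API helps (again assuming the default localhost:9200):

curl 'localhost:9200/_cat/fielddata?v'

This lists per-node fielddata usage by field, e.g. src_ip and dst_host.raw from the warnings above.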

Anyway, when I try to load a dashboard, Elasticsearch trips the breaker...

Am I missing some configuration?

Thanks!

You should move things to doc values and add more heap.
You can also try clearing the fielddata cache, but that is only a short-term solution.
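
For example, to clear only the fielddata cache across all indices (a sketch, assuming Elasticsearch on localhost:9200):

curl -XPOST 'localhost:9200/_cache/clear?fielddata=true'

The memory comes right back as soon as a dashboard loads those fields again, which is why this is only a stopgap.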

Do you have some documentation where I can learn how to move things to doc values? Can I do this from the Kibana search editor?

To solve the issue permanently, how much RAM is necessary?

Take a look at https://www.elastic.co/guide/en/elasticsearch/guide/current/doc-values.html.
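
As a rough sketch of what the mapping change looks like (the template name and field layout here are hypothetical, adjust to your own; in 1.x doc values must be enabled explicitly on not_analyzed fields):

curl -XPUT 'localhost:9200/_template/logstash_doc_values' -d '{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "dst_host": {
          "type": "string",
          "fields": {
            "raw": { "type": "string", "index": "not_analyzed", "doc_values": true }
          }
        }
      }
    }
  }
}'

Note that this only affects newly created indices; existing ones have to be reindexed to benefit.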

How much RAM really depends, I can't give you an answer there.

We are talking about 60 million log entries in 15 days... I can use Curator to delete everything older than 15 days... and keep roughly those 60 million logs.
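
Something like this, I think (a sketch with Curator 3.x, assuming daily logstash-YYYY.MM.DD indices):

curator --host localhost delete indices --older-than 15 --time-unit days --timestring '%Y.%m.%d' --prefix logstash-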

How much memory would I need for this data volume?

Thanks!!