CircuitBreakingException: [parent] Data too large, data error

Hi People,

I'm facing an issue that keeps crashing our cluster. Luckily it is only our acceptance cluster, but it is still a problem because people use that cluster to log their errors. We use Elasticsearch 6.8.

The error that we retrieve from a master node:
[2020-06-25T16:05:11,629][WARN ][r.suppressed ] [es-server01-acc.eu] path: /_bulk, params: {}
org.elasticsearch.common.breaker.CircuitBreakingException: [parent] Data too large, data for [<http_request>] would be [11229874195/10.4gb], which is larger than the limit of [11225477939/10.4gb], usages [request=0/0b, fielddata=0/0b, in_flight_requests=11226443799/10.4gb, accounting=3430396/3.2mb]
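
To see which breaker is actually filling up, we query the breaker section of the node stats API. Here is a minimal sketch of how we check it (the host is a placeholder and the cluster is assumed to have no authentication):

    # Minimal sketch: dump the circuit-breaker stats for every node.
    # ES_HOST is a placeholder -- point it at your own cluster.
    import json
    import urllib.request

    ES_HOST = "http://localhost:9200"

    with urllib.request.urlopen(f"{ES_HOST}/_nodes/stats/breaker") as resp:
        stats = json.load(resp)

    for node in stats["nodes"].values():
        print(node["name"])
        for name, breaker in node["breakers"].items():
            print(f"  {name}: estimated={breaker['estimated_size']} "
                  f"limit={breaker['limit_size']} tripped={breaker['tripped']}")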

We increased the instance type (AWS) and doubled the resources, and we changed the JVM memory setting from 6 GB to 12 GB.

The settings we use in elasticsearch.yml:

    node.name: redacted
    node.master: true
    node.data: true
    path.logs: /var/log/elasticsearch
    path.data: /data
    path.repo: /data/repo
    bootstrap.memory_lock: true
    indices.memory.index_buffer_size: 15%
    #indices.breaker.total.use_real_memory: true
    thread_pool.search.min_queue_size: 1000
    network.host: 0.0.0.0
    discovery.zen.minimum_master_nodes: 3

And these are the JVM settings we use, or rather the ones we've changed compared to the defaults:

    # Xmx represents the maximum size of total heap space
    -Xms15G
    -Xmx15G
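
If I understand the parent breaker correctly, in 6.x its limit defaults to roughly 70% of the JVM heap (if I remember correctly, the real-memory breaker only arrived in 7.x), which would roughly line up with the 10.4gb limit in the error for a 15G heap. This is just my own back-of-the-envelope check, not something from the logs:

    # Back-of-the-envelope check (my assumption: parent breaker ~70% of heap in 6.x)
    heap_bytes = 15 * 1024 ** 3            # -Xmx15G
    parent_limit = int(heap_bytes * 0.70)
    print(parent_limit)                    # 11274289152 bytes, about 10.5gb --
                                           # close to the 11225477939 (10.4gb) in the error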

Following. I have the same issues/symptoms.

What I have found is that the maximum recommended indices.memory.index_buffer_size is 512 MB?!
Please read

I am now confused...

Isn't this telling you that your bulk insert is simply too large? 10 GB is enormous. I'm not sure exactly what is included in that figure or how the bulk size affects it, but what are you sending, and how many fields etc. does it have? Try setting the bulk size to 1 and look at what you are sending.
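
If it helps, this is roughly what I mean by keeping the bulk payload small. It is a minimal sketch only, with a placeholder host, index name and documents, and it uses the plain HTTP _bulk API rather than a client library:

    # Minimal sketch: index documents in small bulk batches instead of one huge request.
    # ES_HOST, INDEX and the documents are placeholders -- adapt to your setup.
    import json
    import urllib.request

    ES_HOST = "http://localhost:9200"
    INDEX = "app-logs"
    BATCH_SIZE = 500   # tune this down (even to 1) if the breaker still trips

    def bulk_index(docs):
        for start in range(0, len(docs), BATCH_SIZE):
            batch = docs[start:start + BATCH_SIZE]
            # NDJSON body: one action line plus one source line per document.
            lines = []
            for doc in batch:
                lines.append(json.dumps({"index": {"_index": INDEX, "_type": "_doc"}}))
                lines.append(json.dumps(doc))
            body = ("\n".join(lines) + "\n").encode("utf-8")
            req = urllib.request.Request(
                f"{ES_HOST}/_bulk",
                data=body,
                headers={"Content-Type": "application/x-ndjson"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                result = json.load(resp)
                if result.get("errors"):
                    print(f"batch starting at {start} reported item-level errors")

    # Example usage with dummy documents:
    bulk_index([{"message": f"log line {i}"} for i in range(2000)])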

