Hi all,
I am trying to index a 6 GB file into my three-node cluster, but I get a circuit_breaking_exception every time.
I have tried increasing the breaker limits with:
"persistent" : { "indices.breaker.fielddata.limit" : "95%", "indices.breaker.total.limit" : "95%" }
but it has had no effect.
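For reference, this is roughly how I apply those settings from the Python client (the host address is just a placeholder for my nodes):

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # placeholder host

# persistent cluster settings, same values as above
es.cluster.put_settings(body={
    "persistent": {
        "indices.breaker.fielddata.limit": "95%",
        "indices.breaker.total.limit": "95%"
    }
})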
I use a Python script to index the documents, and this is the message that I get from the Python library:
Traceback (most recent call last):
  File "json_array.py", line 34, in <module>
    deque(pb, maxlen=0)
  File "/usr/lib/python2.7/site-packages/elasticsearch/helpers/actions.py", line 365, in parallel_bulk
    actions, chunk_size, max_chunk_bytes, client.transport.serializer
  File "/usr/lib64/python2.7/multiprocessing/pool.py", line 655, in next
    raise value
elasticsearch.exceptions.TransportError: TransportError(429, u'circuit_breaking_exception', u'[parent] Data too large, data for [<http_request>] would be [1749216548/1.6gb], which is larger than the limit of [1717986918/1.5gb], real usage: [1748416424/1.6gb], new bytes reserved: [800124/781.3kb]')
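In case it helps, the indexing part of the script looks roughly like this (file path, index name, and chunk sizes here are simplified placeholders, not my exact values):

from collections import deque
import json

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch(["http://localhost:9200"])  # placeholder host

def generate_actions(path):
    # yield one bulk action per document in the JSON array file
    with open(path) as f:
        for doc in json.load(f):
            yield {"_index": "my-index", "_source": doc}

# parallel_bulk returns a lazy generator of (ok, result) tuples
pb = helpers.parallel_bulk(es, generate_actions("data.json"),
                           chunk_size=500,
                           max_chunk_bytes=10 * 1024 * 1024)
deque(pb, maxlen=0)  # drain the generator so the bulk requests are actually sent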
You can find the Elasticsearch logs in this link.
Can you suggest anything? I have been stuck on this for days.
Thanks