Hello Team,
I am trying to push data from a Spark DataFrame to Elasticsearch using the saveToEs bulk API, and I am running into the issue below.
I am using Elasticsearch 7.5.2.
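For reference, a minimal sketch of the kind of write I am doing is below. The index name and node address are taken from the error log further down; the session config and DataFrame are simplified stand-ins for the real job, so treat the exact settings as assumptions.

```scala
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._

object BulkWriteExample {
  def main(args: Array[String]): Unit = {
    // Simplified Spark session; the real cluster and connector settings differ.
    val spark = SparkSession.builder()
      .appName("spark-to-es-bulk")
      .config("es.nodes", "10.48.169.118")     // assumed ES node, from the log below
      .config("es.port", "9200")
      .config("es.batch.size.entries", "1000") // connector defaults; each task flushes bulks of this size
      .config("es.batch.size.bytes", "1mb")
      .getOrCreate()

    // Hypothetical DataFrame standing in for the real data being pushed.
    val df = spark.range(0, 1000).toDF("id")

    // saveToEs issues bulk index requests per partition against the target index.
    df.saveToEs("prov_able")
  }
}
```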
message [failed to perform indices:data/write/bulk[s] on replica [prov_able][4], node[_Dm72t3LTYyI35_zXVGJfg], [R], s[STARTED], a[id=q8R-LIumRxWMPgIzwJiEGA]], failure [RemoteTransportException[[able_es3][10.48.169.118:9300][indices:data/write/bulk[s][r]]]; nested: CircuitBreakingException[[parent] Data too large, data for [<transport_request>] would be [1982508736/1.8gb], which is larger than the limit of [1973865676/1.8gb], real usage: [1982014768/1.8gb], new bytes reserved: [493968/482.3kb], usages [request=65760/64.2kb, fielddata=10448/10.2kb, in_flight_requests=164823858/157.1mb, accounting=7898329/7.5mb]]; ], markAsStale [true]]
org.elasticsearch.transport.RemoteTransportException: [able_es3][10.48.169.118:9300][indices:data/write/bulk[s][r]]
Caused by: org.elasticsearch.common.breaker.CircuitBreakingException: [parent] Data too large, data for [<transport_request>] would be [1982508736/1.8gb], which is larger than the limit of [1973865676/1.8gb], real usage: [1982014768/1.8gb], new bytes reserved: [493968/482.3kb], usages [request=65760/64.2kb, fielddata=10448/10.2kb, in_flight_requests=164823858/157.1mb, accounting=7898329/7.5mb]
at org.elasticsearch.indices.breaker.HierarchyCircuitBreakerService.checkParentLimit(HierarchyCircuitBreakerService.java:343) ~[elasticsearch-7.5.2.jar:7.5.2]
at org.elasticsearch.common.breaker.ChildMemoryCircuitBreaker.addEstimateBytesAndMaybeBreak(ChildMemoryCircuitBreaker.java:128) ~[elasticsearch-7.5.2.jar:7.5.2]
at org.elasticsearch.transport.InboundHandler.handleRequest(InboundHandler.java:171) ~[elasticsearch-7.5.2.jar:7.5.2]
at org.elasticsearch.transport.InboundHandler.messageReceived(InboundHandler.java:119) ~[elasticsearch-7.5.2.jar:7.5.2]
at org.elasticsearch.transport.InboundHandler.inboundMessage(InboundHandler.java:103) ~[elasticsearch-7.5.2.jar:7.5.2]
Regards,
Sarath