Circuit_breaking_exception [parent] Data too large, data for [<http_request>]

I'm trying to bulk-insert a large volume of data (~60M docs) using Spark (39 partitions, 10 concurrent tasks)
and I get the following exception:

org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: org.elasticsearch.hadoop.rest.EsHadoopRemoteException: circuit_breaking_exception: [parent] Data too large, data for [<http_request>] would be [146104859024/13.7gb], which is larger than the limit of [146103767449/13.2gb], real usage: [146104859024/13.7gb], new bytes reserved: [0/0b]
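The parent breaker is tripping on heap pressure while the bulk requests arrive, so one thing I've been considering is shrinking each bulk request via the elasticsearch-hadoop connector settings. A minimal sketch of the write options I'd pass to Spark (the setting names are from the connector's configuration reference; the values here are just guesses to try, not known-good numbers):

```python
# Hypothetical connector options to reduce per-request memory pressure.
# es.batch.size.entries / es.batch.size.bytes cap each bulk request;
# whichever limit is hit first flushes the batch.
es_write_conf = {
    "es.batch.size.entries": "500",   # docs per bulk request (default 1000)
    "es.batch.size.bytes": "1mb",     # max bytes per bulk request (default 1mb)
    "es.batch.write.retry.count": "6" # retries on rejected/failed bulk items
}

# These would then be applied to the DataFrame writer, e.g.:
# df.write.format("org.elasticsearch.spark.sql").options(**es_write_conf).save("my-index")
```

With 10 concurrent tasks each holding a bulk request in flight, smaller batches should lower the peak that the parent breaker sees, at the cost of more round trips.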

I saw in this post that I can disable indices.breaker.total.use_real_memory, but I wondered if there is a better solution.
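For reference, as far as I understand that is a static node setting, so it would go in elasticsearch.yml on each node and take effect after a restart (sketch only):

```yaml
# Static node setting; requires a node restart.
# Makes the parent breaker track reserved memory instead of the
# JVM's reported real heap usage.
indices.breaker.total.use_real_memory: false
```

I'm hesitant because it disables the safety check rather than addressing why the heap is so close to the limit in the first place.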

es_version: 7.7.1
java_version: elastic bundled OpenJDK 14.0.1
GC config: -XX:+UseConcMarkSweepGC