Failed to flush the buffer: "Data too large", could not push logs to Elasticsearch cluster

Hi all, I have been facing this issue for a long time and cannot get the buffer file under control: whenever a large amount of data is ingested, it gets choked. Any help would be appreciated.

```
2023-05-24 17:34:11 +0300 [warn]: #0 failed to flush the buffer. retry_time=0 next_retry_seconds=2023-05-24 17:34:12 +0300 chunk="5fc71656b9fecdfe37463c8e4d16ef5c" error_class=Fluent::Plugin::ElasticsearchOutput::RecoverableRequestFailure error="could not push logs to Elasticsearch cluster ({:host=>"172.23.12.140", :port=>9200, :scheme=>"http"}): [429] {"error":{"root_cause":[{"type":"circuit_breaking_exception","reason":"[parent] Data too large, data for [<http_request>] would be [14197261530/13.2gb], which is larger than the limit of [14131593216/13.1gb], real usage: [14197237728/13.2gb], new bytes reserved: [23802/23.2kb], usages [request=1224/1.1kb, fielddata=4857399338/4.5gb, in_flight_requests=23802/23.2kb, accounting=1862870416/1.7gb]","bytes_wanted":14197261530,"bytes_limit":14131593216,"durability":"PERMANENT"}],"type":"circuit_breaking_exception","reason":"[parent] Data too large, data for [<http_request>] would be [14197261530/13.2gb], which is larger than the limit of [14131593216/13.1gb], real usage: [14197237728/13.2gb], new bytes reserved: [23802/23.2kb], usages [request=1224/1.1kb, fielddata=4857399338/4.5gb, in_flight_requests=23802/23.2kb, accounting=1862870416/1.7gb]","bytes_wanted":14197261530,"bytes_limit":14131593216,"durability":"PERMANENT"},"status":429}"
```
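If it helps, this is the direction I am considering for the buffer section of the fluent-plugin-elasticsearch output, to keep each bulk request small; the match pattern, buffer path, and all the values below are placeholders/guesses on my side, not a confirmed fix:

```
<match app.**>
  @type elasticsearch
  host 172.23.12.140
  port 9200
  scheme http

  # Keep each flushed chunk (and therefore each bulk request) small so a
  # single flush is less likely to trip the Elasticsearch parent circuit breaker.
  <buffer>
    @type file
    path /var/log/fluentd/buffer/elasticsearch   # placeholder path
    chunk_limit_size 8MB        # size of each chunk / bulk request
    total_limit_size 2GB        # cap how large the on-disk buffer can grow
    flush_interval 10s
    flush_thread_count 2
    retry_max_interval 30s
    overflow_action block       # back-pressure the input instead of raising errors
  </buffer>
</match>
```

That said, the 429 above looks like the parent circuit breaker on the Elasticsearch side (fielddata alone is around 4.5gb against a 13.1gb limit), so I am not sure how much of this can really be solved from the Fluentd side alone.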
