Failed to flush outgoing items to AWS ES

Hi community

I use Logstash with AWS Elasticsearch to centralize our logs.
It usually works fine, but I hit this problem when we run a load test.

error_message=>"[413] {\"Message\":\"Request size exceeded 10485760 bytes\"}"
error_class=>"Elasticsearch::Transport::Transport::Errors::RequestEntityTooLarge"

Here is the Logstash log.

[2019-11-05T00:00:34,272][WARN ][logstash.outputs.amazones] Failed to flush outgoing items {:outgoing_count=>6761, :exception=>"Elasticsearch::Transport::Transport::Errors::RequestEntityTooLarge", :backtrace=>["D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/base.rb:202:in __raise_transport_error'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/base.rb:319:in perform_request'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-amazon_es-2.0.1-java/lib/logstash/outputs/amazon_es/aws_transport.rb:48:in perform_request'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/client.rb:131:in perform_request'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/elasticsearch-api-5.0.4/lib/elasticsearch/api/actions/bulk.rb:95:in bulk'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-amazon_es-2.0.1-java/lib/logstash/outputs/amazon_es/http_client.rb:53:in bulk'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-amazon_es-2.0.1-java/lib/logstash/outputs/amazon_es.rb:321:in block in submit'", "org/jruby/ext/thread/Mutex.java:148:in synchronize'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-amazon_es-2.0.1-java/lib/logstash/outputs/amazon_es.rb:318:in submit'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-amazon_es-2.0.1-java/lib/logstash/outputs/amazon_es.rb:351:in flush'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/buffer.rb:219:in block in buffer_flush'", "org/jruby/RubyHash.java:1343:in each'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/buffer.rb:216:in buffer_flush'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/buffer.rb:159:in buffer_receive'", "D:/elk/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-amazon_es-2.0.1-java/lib/logstash/outputs/amazon_es.rb:311:in receive'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/outputs/base.rb:92:in block in multi_receive'", "org/jruby/RubyArray.java:1734:in each'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/outputs/base.rb:92:in multi_receive'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:22:in multi_receive'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/output_delegator.rb:50:in multi_receive'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:487:in block in output_batch'", "org/jruby/RubyHash.java:1343:in each'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:486:in output_batch'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:438:in worker_loop'", "D:/elk/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:393:in block in start_workers'"]}

[2019-11-05T00:00:36,443][ERROR][logstash.outputs.amazones] Attempted to send a bulk request to Elasticsearch configured at '["https://blablabla.es.amazonaws.com:443"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided? {:client_config=>{:hosts=>["https://blablabla.ap-southeast-1.es.amazonaws.com:443"], :region=>"ap-southeast-1", :transport_options=>{:request=>{:open_timeout=>0, :timeout=>60}, :proxy=>nil, :headers=>{"Content-Type"=>"application/json"}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::AWS}, :error_message=>"[413] {\"Message\":\"Request size exceeded 10485760 bytes\"}", :error_class=>"Elasticsearch::Transport::Transport::Errors::RequestEntityTooLarge", :backtrace=>[... identical to the backtrace in the WARN entry above ...]}

Can anyone tell me what causes this issue and where I can adjust the size limit?
Thanks for any ideas.

I found the root cause. The 10485760 bytes (10 MiB) in the error message is the maximum HTTP request payload that Amazon Elasticsearch Service accepts; the limit depends on the domain's instance type, and smaller instance types (such as t2) are capped at 10 MiB. During the load test, Logstash's output buffer filled faster than it could flush, so a single bulk request (6761 events in the WARN entry above) grew past that limit.

And here is the solution: keep each bulk request under the limit by flushing smaller batches from Logstash, or move the domain to a larger instance type, which raises the per-request cap.
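
A minimal sketch of the output block, assuming the amazon_es plugin's flush_size setting (how many events are buffered before a bulk flush); the endpoint, region, and index below are placeholders, and the right value depends on your average event size, so choose flush_size such that flush_size x average event size stays well below 10485760 bytes:

```
output {
  amazon_es {
    hosts  => ["blablabla.ap-southeast-1.es.amazonaws.com"]
    region => "ap-southeast-1"
    index  => "logstash-%{+YYYY.MM.dd}"
    # Default is 500. With ~5 KB events, 250 * 5 KB is roughly 1.25 MB per
    # bulk request, comfortably under the 10 MiB AWS limit. Lower it
    # further if your events are larger. (Placeholder value; tune to
    # your own event sizes.)
    flush_size => 250
  }
}
```

After lowering flush_size, re-run the load test and watch for 413s; if they persist, check whether individual events are unusually large, since a single oversized document can still push a flush over the limit.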
