413 Request Entity Too Large from sending data to ES through logstash?

Hi,

I am using Filebeat (5.4.1) to send logs to Logstash (2.4.1), doing some parsing there, and sending the parsed data to ES (2.4.1).

When I send larger requests to ES, it throws an error like this:

```
Attempted to send a bulk request to Elasticsearch configured at '["http://x.x.x.x:9200"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided? {:error_message=>"[413] <html>\r\n<head><title>413 Request Entity Too Large</title></head>\r\n<body bgcolor=\"white\">\r\n<center><h1>413 Request Entity Too Large</h1></center>\r\n<hr><center>openresty/1.11.2.1</center>\r\n</body>\r\n</html>\r\n", :error_class=>"Elasticsearch::Transport::Transport::Errors::RequestEntityTooLarge", :backtrace=>["/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/transport/base.rb:201:in `__raise_transport_error'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/transport/base.rb:312:in `perform_request'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/client.rb:128:in `perform_request'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/actions/bulk.rb:93:in `bulk'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'", "org/jruby/RubyArray.java:1653:in `each_slice'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:129:in `worker_multi_receive'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:301:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:301:in `output_batch'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:232:in `worker_loop'", "/home/itadmin/logstash/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:201:in `start_workers'"], :level=>:error}
```

The full log is on GitHub.

How can I avoid this? I have seen the option http.max_content_length, which defaults to 100mb. Is increasing it good practice?
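For reference, this is the setting I mean. It is configured in elasticsearch.yml on each node; the 200mb value below is just a hypothetical example, not a recommendation:

```yaml
# elasticsearch.yml
# Maximum size of an HTTP request body accepted by Elasticsearch.
# Defaults to 100mb; raising it lets larger bulk requests through,
# at the cost of more memory pressure per request.
http.max_content_length: 200mb
```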

Thanks


What is the maximum and average size of your documents?

Thanks @Christian_Dahlqvist

Sorry, the problem is that I am running Elasticsearch behind NGINX, so the error

[quote="Yaswanth, post:1, topic:88846"]
413 Request Entity Too Large
[/quote]

is due to NGINX, not Elasticsearch. We set client_max_body_size 20M in the NGINX configuration, and now I am able to send data without any error.
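For anyone hitting the same error, this is roughly what the change looks like. The surrounding proxy block is a hypothetical sketch; only the client_max_body_size directive is the actual fix described above:

```nginx
# nginx.conf — client_max_body_size can go in the http, server, or location context.
# The NGINX default is 1m, which is far too small for Logstash bulk requests.
server {
    listen 9200;

    location / {
        client_max_body_size 20M;          # allow request bodies up to 20 MB
        proxy_pass http://127.0.0.1:9200;  # hypothetical upstream Elasticsearch
    }
}
```

After editing, reload NGINX (e.g. nginx -s reload) for the new limit to take effect.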

But does this 20MB limit apply per Logstash event (i.e. per document)?

Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.