Encountered an unexpected error submitting a bulk request! Will retry


#1

I am installing/testing Logstash 5.1.1. I am able to send logs to Elasticsearch; this is my output:

output {
  elasticsearch {
    timeout    => 30
    hosts      => ["10.10.23.183"]
    index      => "logstash-%{_idx_env_suffix}-%{_idx_type_suffix}-%{+YYYY.MM.dd}"
  }
}
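One thing worth checking with an index name like this: `_idx_env_suffix` and `_idx_type_suffix` must be set on every event before the output runs, otherwise the literal text `%{_idx_env_suffix}` ends up in the index name and Elasticsearch rejects the bulk request. A minimal guard sketch (the `"unknown"` default value is illustrative, not from the original config):

```
filter {
  # Hypothetical guard: default the custom suffix fields so the index
  # name never contains a literal "%{_idx_env_suffix}".
  if ![_idx_env_suffix] {
    mutate { add_field => { "_idx_env_suffix" => "unknown" } }
  }
  if ![_idx_type_suffix] {
    mutate { add_field => { "_idx_type_suffix" => "unknown" } }
  }
}
```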

But after a couple of hours I start to get this error, and therefore Elasticsearch doesn't receive/show anything:

[2017-01-15T11:05:10,371][ERROR][logstash.outputs.elasticsearch] Encountered an unexpected error submitting a bulk request! Will retry. {:error_message=>"undefined method `response' for #<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError:0x543b5904>", :class=>"NoMethodError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:223:in `safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:187:in `safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:109:in `submit'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:76:in `retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:27:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:12:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:42:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:331:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `start_workers'"]}


(Mark Walkom) #2

Anything in ES logs?


#3

On the Elasticsearch side I just get this, multiple times:

[2017-01-17T09:55:06,783][WARN ][o.e.l.LicenseService     ] [testing.test]
#
# License [will expire] on [Thursday, January 19, 2017]. If you have a new license, please update it.
# Otherwise, please reach out to your support contact.
#
# Commercial plugins operate with reduced functionality on license expiration:
# - security
#  - Cluster health, cluster stats and indices stats operations are blocked
#  - All data operations (read and write) continue to work
# - watcher
#  - PUT / GET watch APIs are disabled, DELETE watch API continues to work
#  - Watches execute and write to the history
#  - The actions of the watches don't execute
# - monitoring
#  - The agent will stop collecting cluster and indices metrics
#  - The agent will stop automatically cleaning indices older than [xpack.monitoring.history.duration]
# - graph
#  - Graph explore APIs are disabled

But nothing relevant to the Logstash error:

[2017-01-17T09:59:36,187][ERROR][logstash.outputs.elasticsearch] Encountered an unexpected error submitting a bulk request! Will retry. {:error_message=>"undefined method `response' for #<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError:0x7d8581c0>", :class=>"NoMethodError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:223:in `safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:187:in `safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:109:in `submit'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:76:in `retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:27:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:12:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:42:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:331:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `start_workers'"]}
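Note what the NoMethodError is telling us: the real problem is a `BadResponseCodeError` (a bad HTTP response from Elasticsearch), but the plugin's error handler then calls a `response` method that the error object doesn't define, so the original cause gets masked. A hypothetical Ruby sketch of that failure pattern (not the actual plugin source; names and the HTTP 400 message are illustrative):

```ruby
# Hypothetical sketch: a retry handler assumes every rescued error
# exposes a `response` method. An error class without that method turns
# the real HTTP failure into a NoMethodError, as in the log above.

class BadResponseCodeError < StandardError; end  # note: no `response` method

def handle_bulk_error(err)
  # The handler calls a method the error object doesn't define...
  err.response
rescue NoMethodError => e
  # ...so the original cause ("HTTP 400 ...") is masked by NoMethodError.
  "#{e.class}: #{e.message}"
end

result = handle_bulk_error(BadResponseCodeError.new("HTTP 400 from Elasticsearch"))
puts result
```

This is why the log never shows the underlying response code; fixing the handler (which is what later plugin versions did) surfaces the real error instead.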


#4

Giving this update a try:

bin/logstash-plugin install --version "6.2.0" logstash-output-elasticsearch


#5

Ended up removing Logstash in favor of using Filebeat to write directly to Elasticsearch.
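For anyone taking the same route, a minimal `filebeat.yml` output sketch, assuming the same host as the Logstash config above (the index pattern is illustrative):

```
# Hypothetical minimal Filebeat -> Elasticsearch output (filebeat.yml)
output.elasticsearch:
  hosts: ["10.10.23.183:9200"]
```

You lose Logstash's filtering this way, of course; Filebeat ships the raw lines.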


(DeLoVaN) #6

Hi, I was having the same problem as you.
I was parsing huge log files (150 GB in total), and after 30 minutes the process just stopped with the kind of errors you got.
After a day of hard debugging, I concluded that the problem was with the logstash-output-elasticsearch plugin.
The version bundled with Logstash 5.1.2 is version 5.4 of the plugin. Upgrade it to the latest version (6.2.4 at the time of writing) and the errors are gone; Logstash continues forwarding to Elasticsearch.

To update your plugin, do the following (on Debian) :

sudo /usr/share/logstash/bin/logstash-plugin update logstash-output-elasticsearch

(DeLoVaN) #7

Today, updating to Logstash 5.2 will automatically update the plugin to 6.2.4. Enjoy.


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.