Error sending bulk request to Elasticsearch

I'm able to read some data from a Hive table through a JDBC connection, but when it is inserted into Elasticsearch I get the following error:

[ERROR] 2019-05-28 17:39:36.643 [[main]>worker1] elasticsearch - An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>"no method 'setHeaders' for arguments (org.jruby.java.proxies.ArrayJavaProxy) on Java::OrgApacheHttpClientMethods::HttpPost\n available overloads:\n (org.apache.http.Header)\n (org.apache.http.Header)", :error_class=>"NameError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.6.4-java/lib/manticore/client.rb:504:in request_from_options'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.6.4-java/lib/manticore/client.rb:421:in request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.6.4-java/lib/manticore/client.rb:266:in post'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:69:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:143:in bulk_send'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:128:in bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:296:in safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:201:in submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:169:in retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:38:in multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:118:in multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101:in multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:235:in block in start_workers'"]}

I'm using Elasticsearch 7.1.1 with Logstash 7.1.1. Does anyone know what is happening?


Which version of Java? What does the elasticsearch output configuration look like?

output config:

output {
  elasticsearch {
    index => "hive"
    hosts => ["127.0.0.1"]
  }
}
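For readers comparing configs: the hosts entry above relies on the plugin defaults (HTTP on port 9200). Written out explicitly, it is equivalent to something like the following sketch; the scheme and port are shown only for clarity, they are not part of the original config.

output {
  elasticsearch {
    # "127.0.0.1" with no scheme or port defaults to http on 9200
    hosts => ["http://127.0.0.1:9200"]
    index => "hive"
  }
}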

java:
openjdk version "1.8.0_181"
OpenJDK Runtime Environment (build 1.8.0_181-b13)
OpenJDK 64-Bit Server VM (build 25.181-b13, mixed mode)

Should I upgrade to Java 11?

I'm experiencing the same error described by luisdiasbh, using the JDBC input plugin with Hive-2.1 JDBC drivers. My config file, which has been working with Logstash 6.0, does not work with Logstash 7.0 or 7.1 running on JDK 1.8.0_XXX. I am also using Elasticsearch 7.1, which works fine with my config file under Logstash 6.0 but not 7.0.
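For reference, a JDBC input for Hive along these lines would be typical. This is only a sketch: the driver path, class name, host, credentials, and query below are placeholders, not the actual values used in this thread.

input {
  jdbc {
    # Placeholder path to the Hive JDBC standalone jar
    jdbc_driver_library => "/path/to/hive-jdbc-2.1.1-standalone.jar"
    jdbc_driver_class => "org.apache.hive.jdbc.HiveDriver"
    # Placeholder HiveServer2 connection string
    jdbc_connection_string => "jdbc:hive2://hive-host:10000/default"
    jdbc_user => "hive"
    # Placeholder query
    statement => "SELECT * FROM some_table"
  }
}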

In Logstash 7.0 and 7.1, I can see my input in the rubydebug output codec, so my Hive JDBC driver and connection string are working; the pipeline just doesn't interact nicely with Elasticsearch.
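For anyone reproducing this, the verification step described here can be done with an output section along these lines (a sketch, not the exact config used in this thread): the stdout output prints each event, which confirms the JDBC input works independently of the Elasticsearch output.

output {
  # Prints every event to the console; useful to confirm the jdbc input itself is working
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "hive"
  }
}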

Any suggestions or insights into something that changed in Logstash 7.0 that might cause this?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.