- We are observing intermittent connectivity issues from Logstash to Elasticsearch.
- The following error is seen intermittently in the Logstash logs:
{:timestamp=>"2019-03-05T05:27:32.888000+0100", :message=>"Attempted to send a bulk request to Elasticsearch configured at '["https://xyz.abc.com/elasticsearch/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :error_message=>"xyz.abc.com:443 failed to respond", :error_class=>"Manticore::ClientProtocolException", :backtrace=>["/ABC/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:37:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/ABC/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:79:in `call'", "/ABC/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:256:in `call_once'", "/ABC/logstash-2.4.1/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:153:in `code'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/transport/http/manticore.rb:84:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/transport/base.rb:257:in `perform_request'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.1.0/lib/elasticsearch/transport/client.rb:128:in `perform_request'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/actions/bulk.rb:93:in `bulk'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'", "org/jruby/RubyArray.java:1653:in `each_slice'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:129:in `worker_multi_receive'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:301:in `output_batch'", 
"org/jruby/RubyHash.java:1342:in `each'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:301:in `output_batch'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:232:in `worker_loop'", "/ABC/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:201:in `start_workers'"], :level=>:error}
- Our Logstash connectivity is as follows:
Logstash > Apache (HTTPS) > ProxyPass to http://elasticsearch-IP:9200/elasticsearch
- The Logstash output configuration is shown below (a hedged variant with explicit timeout/retry settings is sketched right after it):
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "https://xyz.abc.com"
    path => "/elasticsearch"
  }
}
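For reference, here is a minimal sketch of the same output with explicit network and retry settings that are sometimes used when bulk requests fail intermittently. It assumes the ssl, timeout, and retry_max_interval options are available in logstash-output-elasticsearch 2.7.1; the values are illustrative, not recommendations:

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "https://xyz.abc.com"    # Apache front end terminating HTTPS
    path => "/elasticsearch"          # path prefix handled by the Apache ProxyPass rule
    ssl => true                       # assumption: explicit TLS toggle alongside the https URL
    timeout => 60                     # assumption: network timeout (seconds) for requests to Elasticsearch
    retry_max_interval => 5           # assumption: maximum interval (seconds) between bulk retries
  }
}

Since https is already given in hosts, the explicit ssl flag is likely redundant; it is included here only to make the TLS expectation visible in the configuration.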
- Our Logstash version is 2.4.1 and our Elasticsearch version is 2.4.2.
- I also need to know whether there will be data loss or data mismatch when the Logstash to Elasticsearch connectivity fails with the above error.
- Please let us know if anything needs to be checked on our side to avoid connectivity issues from Logstash to Elasticsearch.