Elasticsearch input plugin is not working

I'm getting the error below while using the elasticsearch input plugin:

[2018-03-01T02:52:00,652][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x3c8282b7 sleep>"}
[2018-03-01T02:52:02,314][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Pipeline_id:main
  Plugin: <LogStash::Inputs::Elasticsearch hosts=>["https://elasticsearch.domain.com:9200"], user=>"elastic", password=><password>, ca_file=>"/opt/appl/sslcert.pem", ssl=>true, index=>"index_name", query=>"{ \"query\": { \"query_string\": { \"query\": \"*\" } } }", id=>"4cf31a401ca1da31981b181286e01c4e60c7236bce5a2be56d19c8a2c0138d6f", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_cd9e337b-7e73-4c2f-818e-fca2f8341011", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"]>
  Error: Failed to open TCP connection to https:0 (initialize: name or service not known)
  Exception: Faraday::ConnectionFailed
  Stack: org/jruby/ext/socket/RubyTCPSocket.java:137:in `initialize'
org/jruby/RubyIO.java:1154:in `open'

Config file:

input {
  elasticsearch {
    hosts => [ "https://elasticsearch.domain.com:9200" ]
    user => "elastic"
    password => "password"
    ca_file => "/opt/appl/sslcert.pem"
    ssl => true
    index => "index_name"
    query => '{ "query": { "query_string": { "query": "*" } } }'
  }
}

filter {
}

output {
  stdout { codec => rubydebug }
}

Kindly help to rectify the error.

Logstash cannot resolve the hostname as far as I can tell. Replace the hostname with the IP and try again.
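
A quick way to test that theory is to resolve the name directly from the Logstash host (hostname taken from the config above):

getent hosts elasticsearch.domain.com
# or, equivalently:
nslookup elasticsearch.domain.com

If neither returns an address, it is a DNS problem rather than a Logstash one.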

Hi @pjanzen,

Thanks for the info. I tried replacing the hostname with the IP address, but I'm getting the error below:

[2018-03-05T12:21:51,990][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-05T12:21:56,942][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-03-05T12:21:58,051][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Plugin: <LogStash::Inputs::Elasticsearch hosts=>["https://xxx.xxx.xxx.xxx:9200"], user=>"elastic", password=><password>, ca_file=>"/opt/appl/elasticsearch/sslcert.pem", ssl=>true, index=>"index_name", query=>"{ \"query\": { \"query_string\": { \"query\": \"*\" } } }", id=>"89224d2f13fae562e5191b62e7bd7e2d7e931e8e-1", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_d42c0efa-3669-4aab-93de-9a0066f1c575", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"]>
  Error: initialize: name or service not known
  Exception: Faraday::ConnectionFailed
  Stack: org/jruby/ext/socket/RubyTCPSocket.java:129:in `initialize'
org/jruby/RubyIO.java:1197:in `open'
/opt/appl/ELASTIC/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:763:in `connect'
org/jruby/ext/timeout/Timeout.java:98:in `timeout'
/opt/appl/ELASTIC/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:763:in `connect'
/opt/appl/ELASTIC/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:756:in `do_start'
/opt/appl/ELASTIC/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:745:in `start'
/opt/appl/ELASTIC/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1293:in `request'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:82:in `perform_request'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in `call'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:in `with_net_http_connection'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in `call'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:in `build_response'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in `run_request'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/http/faraday.rb:26:in `perform_request'
org/jruby/RubyProc.java:281:in `call'
/opt/appl/ELASTIC/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/base.rb:262:in `perform_request' 

My current config:

input {
  elasticsearch {
    hosts => [ "https://xxx.xxx.xxx.xxx:9200" ]
    user => "elastic"
    password => "Password"
    ca_file => "/opt/appl/elasticsearch/sslcert.pem"
    ssl => true
    index => "index_name"
    query => '{ "query": { "query_string": { "query": "*" } } }'
  }
}

filter {
}

output {
  stdout { codec => rubydebug }
}

However, the elasticsearch output plugin works with both the hostname and the IP address.

Config:

output {
  elasticsearch {
    hosts => "https://domain.com:9200"
    user => "elastic"
    password => "Password"
    ssl => true
    ssl_certificate_verification => true
    cacert => '/opt/appl/elasticsearch/sslcert.pem'
    index => "test_%{index_day}"
    document_id => "%{uniq_id}"
  }
}
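
One thing stands out when comparing the two: the first error said "Failed to open TCP connection to https:0", which suggests the input plugin split the hosts string on ":" and treated the "https" scheme as the host name. Unlike the output plugin, the elasticsearch input plugin documents its hosts entries as plain host or host:port values, with TLS enabled separately through ssl => true. A sketch of that form, reusing the hostname and certificate path from this thread (check the hosts format documented for your plugin version):

input {
  elasticsearch {
    # host:port only, no URL scheme; TLS is enabled via ssl => true
    hosts => [ "elasticsearch.domain.com:9200" ]
    user => "elastic"
    password => "password"
    ssl => true
    ca_file => "/opt/appl/sslcert.pem"
    index => "index_name"
    query => '{ "query": { "query_string": { "query": "*" } } }'
  }
}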

From what I can find on the internet (which you have probably also done), it could be your certificates. Have they expired by any chance? Also, could you test this without SSL, just to see whether it works?
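
If the certificate is the suspect, a couple of quick checks from the Logstash host can narrow it down (paths, hostname, and user are the ones from this thread; adjust as needed):

# print the certificate's validity period to check for expiry
openssl x509 -noout -dates -in /opt/appl/elasticsearch/sslcert.pem

# test the TLS connection, validating against the same CA file
curl --cacert /opt/appl/elasticsearch/sslcert.pem -u elastic https://elasticsearch.domain.com:9200

# repeat without certificate verification to isolate certificate problems
curl -k -u elastic https://elasticsearch.domain.com:9200

If the -k variant works but the --cacert one fails, the certificate (or the CA file) is the problem rather than the network.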
