Elasticsearch connection refused

Hi, I have a problem with my ELK setup and I'm seeing the log messages below.

{:timestamp=>"2016-11-28T21:10:22.958000+1100", :message=>"Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200"]', but Elasticsearch appears to be unreachable or down!", :error_message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :level=>:error}
{:timestamp=>"2016-11-28T21:17:31.787000+1100", :message=>"SIGTERM received. Shutting down the agent.", :level=>:warn}
{:timestamp=>"2016-11-28T21:17:31.788000+1100", :message=>"stopping pipeline", :id=>"main"}
{:timestamp=>"2016-11-28T21:17:32.032000+1100", :message=>"Pipeline main has been shutdown"}
{:timestamp=>"2016-11-28T21:37:19.359000+1100", :message=>"Pipeline main started"}
{:timestamp=>"2016-11-28T22:05:15.440000+1100", :message=>"SIGTERM received. Shutting down the agent.", :level=>:warn}
{:timestamp=>"2016-11-28T22:05:15.443000+1100", :message=>"stopping pipeline", :id=>"main"}
{:timestamp=>"2016-11-28T22:05:15.977000+1100", :message=>"Pipeline main has been shutdown"}

I need help understanding what went wrong with my installation.

Thanks

Elasticsearch appears to be unreachable or down!

That is most likely the reason: the log shows Logstash getting "Connection refused" from http://localhost:9200, so Elasticsearch is either not running or not listening on that address.
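A quick way to confirm this is to check the service and the address Logstash is pointing at. This is just a sketch, assuming a .deb install on Ubuntu 16.04 with systemd and the default log location:

sudo systemctl status elasticsearch                        # is the service actually running?
curl -XGET 'http://localhost:9200/?pretty'                 # does it answer on the address Logstash uses?
sudo tail -n 50 /var/log/elasticsearch/elasticsearch.log   # any errors during startup?

If the service keeps stopping, the Elasticsearch log usually says why (heap/memory problems, a bad config, permissions, and so on).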

Hi,

Is there a compatibility issue between the package versions?

I am using the following package versions (there is a quick way to double-check them right after the list):

logstash 2.3
kibana 4.5
elasticsearch 2.x
nginx 1.10
ubuntu 16.04
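In case the exact builds matter, the installed versions can be double-checked like this (assuming everything was installed from .deb packages via apt):

dpkg -l | grep -E 'logstash|kibana|elasticsearch|nginx'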

Thanks

I get the result below when I check Elasticsearch with curl.

curl -4 localhost:9200
{
  "name" : "Iceman",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "2W1HX0GST1eBDD1w9-ShPA",
  "version" : {
    "number" : "2.4.2",
    "build_hash" : "161c65a337d4b422ac0c805f284565cf2014bb84",
    "build_timestamp" : "2016-11-17T11:51:03Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.2"
  },
  "tagline" : "You Know, for Search"
}

Thanks

What is your Logstash config?

Hi Dadoonet,

Below is my logstash.conf

input {
  beats {
    port => 5044
    ssl => true
    #ssl_certificate => "/etc/ssl/logstash-forwarder.crt"
    #ssl_key => "/etc/ssl/logstash-forwarder.key"
    ssl_certificate => "/etc/pki/tls/cert/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    congestion_threshold => "40"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    date {
      match => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}

Thanks

Please format your code using the </> icon. It will make your post more readable.

If you remove the connection to elasticsearch, can you see your lines of logs printed to the console?
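For example, a minimal sketch of that test, based on your logstash.conf above: comment out the elasticsearch block so only stdout remains, then restart Logstash and watch the console.

output {
  # elasticsearch output temporarily disabled for this test
  #elasticsearch {
  #  hosts => ["localhost"]
  #  index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  #}
  stdout {
    codec => rubydebug
  }
}

If events show up with the rubydebug codec, the beats input and filters are fine and the problem is only the connection to Elasticsearch.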

I don't see anything obvious from your config here. Moving your question to #logstash

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.