Receiving this error


#1

Hi all,
I'm receiving the following errors every 2-5 minutes:

[2018-01-10T09:55:35,117][WARN ][logstash.outputs.elasticsearch] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://elastic:xxxxxx@127.0.0.1:9200/][Manticore::SocketTimeout] Read timed out {:url=>http://elastic:xxxxxx@127.0.0.1:9200/, :error_message=>"Elasticsearch Unreachable: [http://elastic:xxxxxx@127.0.0.1:9200/][Manticore::SocketTimeout] Read timed out", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
[2018-01-10T09:55:35,117][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch' but Elasticsearch appears to be unreachable or down! {:error_message=>"Elasticsearch Unreachable: [http://elastic:xxxxxx@127.0.0.1:9200/][Manticore::SocketTimeout] Read timed out", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,160][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,161][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,192][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,192][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,204][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,204][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,204][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,204][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,253][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,254][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,261][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,261][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:35,268][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:35,269][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:36,031][WARN ][logstash.outputs.elasticsearch] UNEXPECTED POOL ERROR {:e=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError: No Available connections>}
[2018-01-10T09:55:36,031][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-01-10T09:55:36,700][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@127.0.0.1:9200/, :path=>"/"}
[2018-01-10T09:55:36,701][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@127.0.0.1:9200/"}
[2018-01-10T09:55:37,573][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@127.0.0.1:9200/, :path=>"/"}
[2018-01-10T09:55:37,574][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@127.0.0.1:9200/"}

What is the problem? Can you help me?


(David Pilato) #2

Please format your code. I edited your post.

Perhaps Elasticsearch is unreachable or down?

Anything in elasticsearch logs?


#3

Nothing... The Elasticsearch process is up and running without errors,
and the other LS nodes are correctly connected and ingesting data!

I still receive the error even after a restart...


(David Pilato) #4

So Elasticsearch seems OK. I'm moving your question to #logstash, but a first guess is that you are having trouble reaching Elasticsearch from the host where your LS instance is running.
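Judging from the log excerpt, the output marks the URL as dead after a read timeout and the periodic health check later restores the connection. If the node is just slow to respond rather than truly unreachable, the timeout-related settings of the elasticsearch output can be relaxed. A minimal sketch, with illustrative values only (not a recommendation):

output {
  elasticsearch {
    hosts    => [ "127.0.0.1:9200" ]
    user     => "elastic"
    password => "changeme"
    # Seconds to wait for a response before the request times out and the
    # URL is marked dead (the plugin default is 60).
    timeout => 120
    # Seconds between attempts to resurrect a URL that was marked dead
    # (the plugin default is 5).
    resurrect_delay => 10
  }
}

If the timeouts persist even with a higher timeout, that points back at connectivity or an overloaded node rather than at the client settings.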


#5

I would add that, as you can see, LS and ES are running on the same host.


(Krunal Kalaria) #6

Hi @sbampa,
can you show me your conf file first?


#7

input {
  beats {
    port => "5043"
  }
  file {
    path => [
      "/opt/logs/sito02/wildfly-auth-sito2/server.log",
      "/opt/logs/sito02/wildfly-intserv-sito2/server.log",
      "/opt/logs/sito02/wildfly-wallet-sito2/server.log",
      "/opt/logs/sito03/wildfly-intserv-sito3/server.log",
      "/opt/logs/sito03/wildfly-wallet-sito3/server.log",
      "/opt/logs/sito03/wildfly-auth-sito3/server.log",
      "/opt/logs/sito04/wildfly-info-sito4/server.log",
      "/opt/logs/sito04/wildfly-sport-sito4/server.log",
      "/opt/logs/sito04/wildfly-auth-sito4/server.log",
      "/opt/logs/sito05/wildfly-auth-sito5/server.log",
      "/opt/logs/sito05/wildfly-sport-sito5/server.log",
      "/opt/logs/sito05/wildfly-gaming-sito5/server.log",
      "/opt/logs/sito06/wildfly-info-sito6/server.log",
      "/opt/logs/sito06/wildfly-sport-sito6/server.log",
      "/opt/logs/sito06/wildfly-sportschedule-sito6/server.log",
      "/opt/logs/sito07/wildfly-gaming-sito7/server.log",
      "/opt/logs/sito07/wildfly-sport-sito7/server.log",
      "/opt/logs/sito08/wildfly-sport-sito8/server.log",
      "/opt/logs/sito08/wildfly-intserv-sito8/server.log",
      "/opt/logs/sito09/wildfly-sport-sito9/server.log",
      "/opt/logs/sito09/wildfly-intserv-sito9/server.log",
      "/opt/logs/sito10/wildfly-sportschedule-sito10/server.log",
      "/opt/logs/sito11/wildfly-info-sito11/server.log",
      "/opt/logs/sito11/wildfly-sale-sito11/server.log",
      "/opt/logs/sito12/wildfly-info-sito12/server.log",
      "/opt/logs/sito12/wildfly-sale-sito12/server.log",
      "/opt/logs/sito13/wildfly-info-sito13/server.log",
      "/opt/logs/sito13/wildfly-sale-sito13/server.log",
      "/opt/logs/sito14/wildfly-info-sito14/server.log",
      "/opt/logs/sito14/wildfly-sale-sito14/server.log",
      "/opt/logs/sito15/wildfly-intserv-sito15/server.log",
      "/opt/logs/sito15/wildfly-sport-sito15/server.log"
    ]
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
    exclude => "*.gz"
    add_field => { "log_type" => "sitonew" }
    start_position => "end"
    sincedb_path => "/vol1/since.db"
  }
  file {
    path => [ "/opt/logs/betradar-live/betradar_live_odds_psqf3/echo/*.echo" ]
    codec => multiline {
      pattern => "^%{HTTPDERROR_DATE} "
      negate => true
      what => "previous"
    }
    exclude => "*.gz"
    add_field => { "log_type" => "betradar_live" }
    start_position => "end"
    sincedb_path => "/dev/null"
  }
  file {
    path => [ "/opt/logs/wildfly-betgen01/wildfly/server.log" ]
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
      max_lines => 4000
    }
    exclude => "*.gz"
    add_field => { "log_type" => "betgenius" }
    start_position => "end"
    sincedb_path => "/vol1/since2.db"
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} "]
    match => ["message", "%{HTTPDERROR_DATE:timestamp} (%{NUMBER:bytes}) "]
    match => ["message", "%{TIME:timestamp} %{LOGLEVEL:level}"]
    match => ["message", "%{BIND9_TIMESTAMP:timestamp}"]
  }
  date {
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss,SSS", "HH:mm:ss,SSS", "EEE MMM dd HH:mm:ss YYYY", "dd-MMM-yyyy HH:mm:ss.SSS" ]
    timezone => "Europe/Rome"
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "timestamp","offset","level","@version","input_type","beat","type","tags","sort" ]
  }
}
output {
  if [fields][log_type] == "totem" {
    elasticsearch {
      hosts => [ "172.16.0.4:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "totem-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "backoffice-agenzie" {
    elasticsearch {
      hosts => [ "172.16.0.4:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "backoffice-agenzie-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "backoffice-interno" {
    elasticsearch {
      hosts => [ "172.16.0.4:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "backoffice-interno-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [log_type] == "betgenius" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "betgenius-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "worldmatch" {
    elasticsearch {
      hosts => [ "172.16.0.4:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "worldmatch-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "sito-gioco" {
    elasticsearch {
      hosts => [ "172.16.0.4:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "sito-gioco-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "prodven1_sport_psqf" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "prodven1_sport_psqf%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "prodven1_sport_web" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "prodven1_sport_web-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else if [fields][log_type] == "mob-prodpal" {
    elasticsearch {
      hosts => [ "172.16.0.4:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "mob-prodpal-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else {
    elasticsearch {
      hosts => [ "127.0.0.1:9200", "172.16.0.5:9200", "172.16.0.6:9200" ]
      user => elastic
      password => "changeme"
      manage_template => false
      index => "others-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}


(Krunal Kalaria) #8

else if [log_type] == "betgenius" {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    user => elastic
    password => "changeme"
    manage_template => false
    index => "betgenius-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }

In the else if [log_type] == "betgenius" branch you test only [log_type], while in all the other else-if conditions you test [fields][log_type]. Is that correct?
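For what it's worth, if both locations are in play (the file inputs set log_type at the top level via add_field, while Beats usually ships custom fields under [fields]), one option is to normalize the field in a filter so every output condition can test the same place. A minimal sketch, assuming that layout:

filter {
  # If the event arrived via Beats, copy the nested field to the top level
  # so the output conditionals only need to check [log_type].
  if ![log_type] and [fields][log_type] {
    mutate {
      copy => { "[fields][log_type]" => "[log_type]" }
    }
  }
}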

If I'm not wrong, there are quite a few Elasticsearch IPs in your output section. I think the issue could be there: I once had the same error, and in my case the problem turned out to be a wrong Elasticsearch IP, so please check whether the addresses above are correct.

Thanks & Regards,
Krunal.


#9

The problem occurs only with the backoffice-* log types,
and it appears in both cases: with [fields][log_type] and with [log_type] only...
:frowning:


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.