Hi, I have a question. My cluster is green, but my Logstash keeps logging this error:
Oct 15 09:17:16 logstashmysql logstash[14839]: [2019-10-15T09:17:16,974][WARN ][logstash.outputs.elasticsearch] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: ...url=>http://172.31.1
Oct 15 09:17:16 logstashmysql logstash[14839]: [2019-10-15T09:17:16,975][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch' but Elasticsearch appears to be unreachable or down! {:error_message=>"Elasticsearch Unreachable: [http:...
Oct 15 09:17:19 logstashmysql logstash[14839]: [2019-10-15T09:17:19,339][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.31.182.177:9200/"}
My Elasticsearch and Logstash configurations are:
node.name: elk1
node.master: true
node.data: false
path.data: /data/elasticsearch
path.logs: /var/log/elasticsearch
bootstrap.memory_lock: true
transport.tcp.compress: true
thread_pool:
  write:
    queue_size: 2000
network.host: 0.0.0.0
http.port: 9200
discovery.zen.ping.unicast.hosts: ["elk1", "elk2","elk3","elk4","elk5","elk6","elk7","elk8","elk9","elk10"]
gateway.recover_after_nodes: 4
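(One aside on the node settings above, in case it matters: since this cluster still uses zen discovery, I understand that on 6.x the unicast host list alone does not prevent nodes from briefly dropping out; `discovery.zen.minimum_master_nodes` is usually set to a quorum of the master-eligible nodes. A sketch only; the count of three master-eligible nodes here is a made-up example, not my real layout:)

```yaml
# Sketch, assuming hypothetically that 3 of the 10 elk nodes are master-eligible.
# Quorum = (master_eligible_nodes / 2) + 1 = 2
discovery.zen.minimum_master_nodes: 2
```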
input {
  kafka {
    bootstrap_servers => "172.31.182.167:9092,172.31.182.168:9092,172.31.182.169:9092"
    topics => ["demo1"]
    decorate_events => true
    codec => json
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss.SSSSSS", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  if [fields][logtype] == "ceph01_ceph" {
    elasticsearch {
      hosts => ["172.31.182.162:9200","172.31.182.163:9200","172.31.182.177:9200","172.31.182.203:9200","172.31.182.204:9200","172.31.182.205:9200","172.31.182.210:9200"]
      index => "ceph-%{+YYYY.MM.dd}"
    }
  }
}
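(Since the error is transient and the connection is restored a few seconds later, I was wondering whether raising the client-side timeouts on the elasticsearch output would smooth it over. A sketch, assuming the `timeout`, `resurrect_delay`, and `validate_after_inactivity` options of the logstash-output-elasticsearch plugin; the values below are guesses, not tuned:)

```
elasticsearch {
  # hosts and index unchanged from the output block above
  timeout => 120                      # per-request timeout in seconds
  resurrect_delay => 5                # how often URLs marked dead are retried (seconds)
  validate_after_inactivity => 2000   # re-validate idle connections after this many ms
}
```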