Bulk error in Logstash

I am getting the error below from Logstash frequently. Is this caused by Elasticsearch?

[2018-10-26T08:31:12,884][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-10-26T08:31:12,883][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}
[2018-10-26T08:31:12,882][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>2}

I am sure that Elasticsearch is healthy, and I can curl all nodes in the cluster.
Is there any other configuration I can apply to reduce the bulk size in Logstash, or to increase the acceptable limit in Elasticsearch?
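In case it helps, a sketch of the two settings that usually govern this (the values shown are illustrative defaults, not recommendations): Logstash sizes its bulk requests from the pipeline batch settings in `logstash.yml`, and Elasticsearch caps the HTTP payload it will accept with `http.max_content_length` in `elasticsearch.yml`.

```
# logstash.yml — smaller batches mean smaller bulk requests
pipeline.batch.size: 125    # events per worker batch (default is 125)
pipeline.batch.delay: 50    # ms to wait before flushing a partial batch

# elasticsearch.yml — maximum HTTP payload Elasticsearch will accept
http.max_content_length: 100mb    # default is 100mb; raise only if needed
```

Note that the "no living connections" error usually indicates the connection itself failed rather than an oversized bulk request, so it is worth confirming connectivity from the Logstash host specifically, not just from wherever you ran curl.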

Were you inserting bulk data? And at what times does this issue occur frequently?

I am using the Logstash elasticsearch output plugin. I think it uses the bulk API only.

In your input configuration, are you retrieving data from a database, or from any APIs?

I am retrieving data from Kafka and pushing it to Elasticsearch.
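For reference, a minimal Kafka-to-Elasticsearch pipeline of this shape looks like the following (the host names and topic are placeholders, not taken from this thread). Listing every Elasticsearch node under `hosts` gives the output plugin's connection pool more endpoints to fail over to, which can reduce these "no living connections" bursts when a single node drops out.

```
input {
  kafka {
    bootstrap_servers => "kafka-host:9092"   # placeholder
    topics => ["my-topic"]                   # placeholder
  }
}

output {
  elasticsearch {
    # placeholders; list all cluster nodes so the connection
    # pool can fail over if one becomes unreachable
    hosts => ["http://es-node1:9200", "http://es-node2:9200"]
  }
}
```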

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.