Query shard exception - all shards failed

Hi, I'm encountering a problem while indexing some data. At the end of the post you can find the error message I see in the Logstash logs.
It seems that Logstash (which is using an Elasticsearch filter plugin to enrich some data) can't complete the query because of an issue on Elasticsearch.
If I query Elasticsearch from Kibana I get "Courier fetch: 1 of 2 shards failed".

I tried deleting the index and re-indexing all the data, but the problem keeps occurring.
This is not the first time I've used this Logstash instance to index data with this filter, and it has always worked properly.

Could you please help me understand what is going wrong?

Error message from Logstash:

[2018-07-04T07:00:49,719][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"logstash-zonabng", :query=>"bng_hostname:%{[bng_name]}", :event=>#<LogStash::Event:0x2f8b2082>, :error=>#<Elasticsearch::Transport::Transport::Errors::BadRequest: [400] {"error":{"root_cause":[{"type":"query_shard_exception","reason":"Failed to parse query [bng_hostname:%{[bng_name]}]","index_uuid":"xH492mfbTAiuFe_MGcWLSQ","index":"logstash-zonabng"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"logstash-zonabng","node":"Q-diM7ALQqy8WYRA-hs5hg","reason":{"type":"query_shard_exception","reason":"Failed to parse query [bng_hostname:%{[bng_name]}]","index_uuid":"xH492mfbTAiuFe_MGcWLSQ","index":"logstash-zonabng","caused_by":{"type":"parse_exception","reason":"Cannot parse 'bng_hostname:%{[bng_name]}': Encountered \" \"]\" \"] \"\" at line 1, column 24.\nWas expecting:\n \"TO\" ...\n ","caused_by":{"type":"parse_exception","reason":"Encountered \" \"]\" \"] \"\" at line 1, column 24.\nWas expecting:\n \"TO\" ...\n "}}}}]},"status":400}>}

I know this error. Check your data and your filter or mapping.

The filter is correct, since it has always worked with this data. From my understanding, the query is failing because of a shard problem on Elasticsearch.

This is the Logstash Elasticsearch query filter. What happens to the document when the query fails? Should it be indexed anyway, without the information retrieved from Elasticsearch? Can this query be improved with some sort of condition (if the query fails, then...)? I've sketched below the config the kind of guard I have in mind.

    elasticsearch {
            hosts => ["elasticsearch:9200"]
            index => ["logstash-zonabng"]
            query => "bng_hostname:%{[bng_name]}"
            fields => {"zona_bng" => "zona_bng"}
    }
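For instance, would wrapping the filter in a conditional like this make sense? This is just a sketch of what I mean, and I'm not sure it is the right condition to check:

    # skip the lookup for events already tagged with a failed lookup
    if "_elasticsearch_lookup_failure" not in [tags] {
        elasticsearch {
            hosts => ["elasticsearch:9200"]
            index => ["logstash-zonabng"]
            query => "bng_hostname:%{[bng_name]}"
            fields => {"zona_bng" => "zona_bng"}
        }
    }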

Don't use this query; try with the default query instead.


It seems to be working now, but I'm more interested in understanding:

  1. Why, if Logstash failed to query Elasticsearch, do I get a shard exception from Kibana? Shouldn't the two things be uncorrelated?
  2. What is the best practice for dealing with this situation? I mean, I have an Elasticsearch query in Logstash that could fail sometimes. How can I prevent this failure from affecting the whole shard?

This failed with a data error; Logstash can't display it.

I would like to prevent Logstash from querying Elasticsearch if _elasticsearch_lookup_failure is in the tags, but apparently this is not working. I have tried to put a

if "_elasticsearch_lookup_failure" not in [tags] {

around the elasticsearch filter plugin, but it is not doing the job.
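Thinking about it more, the parse error shows the literal string %{[bng_name]} being sent to Elasticsearch, so I suspect some events simply don't have a [bng_name] field and the sprintf reference is never substituted. If that's right, the _elasticsearch_lookup_failure tag can't help here: as far as I understand, it is only added to an event after its own lookup has already failed, so a fresh event never carries it before the query runs. A guard on the field itself might work better; again just a sketch, assuming [bng_name] is the only field the query depends on:

    # run the lookup only when the event actually has a bng_name to interpolate
    if [bng_name] {
        elasticsearch {
            hosts => ["elasticsearch:9200"]
            index => ["logstash-zonabng"]
            query => "bng_hostname:%{[bng_name]}"
            fields => {"zona_bng" => "zona_bng"}
        }
    }

That should at least keep a missing field from producing the unparsable query, though I'd still like to understand why a single failing query showed up as a failed shard in Kibana.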

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.