Discover is including a must_not match_phrase clause that excludes all results in the index

I'm trying to add an index for tracking metrics and have confirmed that documents are being written to it with a simple match-all query. However, no results show up for the index on the Discover page. Looking at the msearch request, I see that a must_not match_phrase clause is being included that excludes all of our results.
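
For context, the match-all check was roughly the following in Dev Tools, with metrics-* standing in for the exact index name, and it returns documents as expected:

GET metrics-*/_search
{ "query": { "match_all": {} }, "size": 1 }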

Here is the msearch query body:
{"index":["metrics-"],"ignore_unavailable":true,"preference":1568826385467}
{"query":{"bool":{"must":[{"query_string":{"analyze_wildcard":true,"query":"
"}},{"range":{"@timestamp":{"gte":1568523600000,"lte":1569128399999,"format":"epoch_millis"}}}],"must_not":[{"match_phrase":{"logger_name":{"query":"metrics"}}}]}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"":{"highlight_query":{"bool":{"must":[{"query_string":{"analyze_wildcard":true,"query":"","all_fields":true}},{"range":{"@timestamp":{"gte":1568523600000,"lte":1569128399999,"format":"epoch_millis"}}}],"must_not":[{"match_phrase":{"logger_name":{"query":"metrics"}}}]}}}},"fragment_size":2147483647},"version":true,"size":500,"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"_source":{"excludes":},"aggs":{"2":{"date_histogram":{"field":"@timestamp","interval":"3h","time_zone":"America/Chicago","min_doc_count":1}}},"stored_fields":["*"],"script_fields":{},"docvalue_fields":["@timestamp"]}

When I access other indices in Discover, I see that the same must_not clause for logger_name is being added as well:
{"index":["logstash-2019.09.19"],"ignore_unavailable":true,"preference":1568826385467}
{"query":{"bool":{"must":[{"query_string":{"analyze_wildcard":true,"query":""}},{"range":{"@timestamp":{"gte":1568523600000,"lte":1569128399999,"format":"epoch_millis"}}}],"must_not":[{"match_phrase":{"logger_name":{"query":"metrics"}}}]}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"":{"highlight_query":{"bool":{"must":[{"query_string":{"analyze_wildcard":true,"query":"","all_fields":true}},{"range":{"@timestamp":{"gte":1568523600000,"lte":1569128399999,"format":"epoch_millis"}}}],"must_not":[{"match_phrase":{"logger_name":{"query":"metrics"}}}]}}}},"fragment_size":2147483647},"version":true,"size":0,"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"_source":{"excludes":[]},"aggs":{"2":{"date_histogram":{"field":"@timestamp","interval":"3h","time_zone":"America/Chicago","min_doc_count":1}}},"stored_fields":[""],"script_fields":{},"docvalue_fields":["@timestamp"]}

I've dug through the settings for Kibana and the indices and can't find anything special for logger_name=metrics. There are no filters being applied here, and logger_name and logger_name.keyword are not excluded in the index settings; they are both searchable and aggregatable (checked roughly as shown below). What am I missing here?
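
The searchable/aggregatable check was along these lines (field caps API, with metrics-* again standing in for the actual index name):

GET metrics-*/_field_caps?fields=logger_name,logger_name.keyword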

What version of Kibana are you running? There was a bug in an older version where filters could still be applied even though they didn't show up in the UI. If that's what's happening, it should be possible to clear it out by clicking the "New" button in the top left corner of the Discover page.
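
If you want to confirm before clearing it, filters are kept in the rison-encoded _a (app state) and _g (pinned/global state) parameters of the Discover URL. A hidden negated phrase filter would look roughly like the sketch below; the exact shape varies by Kibana version:

_a=(filters:!((meta:(disabled:!f,key:logger_name,negate:!t,type:phrase,value:metrics),query:(match_phrase:(logger_name:metrics)))))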

It was an older version of Kibana. Clicking New cleared out the filters.

Thanks!

