RESOLVED: Search Bug in 5.6.12 when searching with an Alias

I'm running 5.6.12 (both Kibana and ES). I have an alias that points to 31 indices. In Kibana I have two index patterns:

  • panorama-staging -- an alias
  • *staging*-ivr-auth-master-* -- the raw pattern used to create the alias

If I use the _cat/aliases/panorama-staging endpoint and compare it with the _cat/indices/*staging*-ivr-auth-master-* endpoint, both return the same 31 indices.
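For anyone following along, that check looks like this in the Dev Tools console (the ?v parameter just adds column headers to the _cat output):

GET _cat/aliases/panorama-staging?v
GET _cat/indices/*staging*-ivr-auth-master-*?v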

If I enter the search container_name: debug-receiver in the Kibana search bar, I get different results depending on the index pattern. With the alias, I get 0 results no matter the time range. With the wildcard pattern, I get results as expected.

If I use the search API against either, I get the expected results back (both return roughly the same number of hits; we're continuously ingesting data, so some variance is expected):

POST panorama-staging/_search
{
  "query": {
    "bool": {
      "must": {
        "query_string": {
          "query": "container_name: debug-receiver"
        }
      }
    }
  },
  "size": 10
}

POST *staging*-ivr-auth-master-*/_search
{
  "query": {
    "bool": {
      "must": {
        "query_string": {
          "query": "container_name: debug-receiver"
        }
      }
    }
  },
  "size": 10
}

The alias is created by the index template, as seen here:

  "ivr-auth-staging": {
    "order": 100,
    "template": "*staging*-ivr-auth-master-*",
    "settings": {
    },
    "mappings": {},
    "aliases": {
      "panorama-staging": {}
    }
  }
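For completeness, a template like this would have been put in place with something along these lines (a sketch; our real settings and mappings are elided here, just as above):

PUT _template/ivr-auth-staging
{
  "order": 100,
  "template": "*staging*-ivr-auth-master-*",
  "settings": {},
  "mappings": {},
  "aliases": {
    "panorama-staging": {}
  }
}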

Okay, I figured it out! Typing all of this information out helped me narrow it down. I opened the Chrome dev tools and grabbed the network request Kibana sends for the search. The query issued through the alias index pattern was the following, and it returns 0 results whether I run it against the alias or against the raw pattern. The cause: I had set up that index pattern with timestamp as the time field instead of @timestamp (we have some duplicated time fields due to the breadth of micro-services logging), so the range filter never matches anything.

{
  "version": true,
  "size": 500,
  "sort": [
    {
      "_score": {
        "order": "desc"
      }
    }
  ],
  "query": {
    "bool": {
      "must": [
        {
          "query_string": {
            "analyze_wildcard": true,
            "query": "container_name: debug-receiver"
          }
        },
        {
          "range": {
            "timestamp": {
              "gte": 1541083029539,
              "lte": 1541083929539,
              "format": "epoch_millis"
            }
          }
        }
      ],
      "must_not": []
    }
  },
  "_source": {
    "excludes": []
  },
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "timestamp",
        "interval": "30s",
        "time_zone": "America/New_York",
        "min_doc_count": 1
      }
    }
  },
  "stored_fields": [
    "*"
  ],
  "script_fields": {},
  "docvalue_fields": [
    "@timestamp",
    "headers.created_utc",
    "start_utc",
    "t",
    "timestamp",
    "timestampUtc",
    "ts"
  ],
  "highlight": {
    "pre_tags": [
      "@kibana-highlighted-field@"
    ],
    "post_tags": [
      "@/kibana-highlighted-field@"
    ],
    "fields": {
      "*": {
        "highlight_query": {
          "bool": {
            "must": [
              {
                "query_string": {
                  "analyze_wildcard": true,
                  "query": "container_name: debug-receiver",
                  "all_fields": true
                }
              },
              {
                "range": {
                  "timestamp": {
                    "gte": 1541083029539,
                    "lte": 1541083929539,
                    "format": "epoch_millis"
                  }
                }
              }
            ],
            "must_not": []
          }
        }
      }
    },
    "fragment_size": 2147483647
  }
}
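For reference, swapping the range filter (and the date_histogram field) over to @timestamp is what makes this return hits again; a minimal sketch of the corrected core query:

POST panorama-staging/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "query_string": {
            "analyze_wildcard": true,
            "query": "container_name: debug-receiver"
          }
        },
        {
          "range": {
            "@timestamp": {
              "gte": 1541083029539,
              "lte": 1541083929539,
              "format": "epoch_millis"
            }
          }
        }
      ]
    }
  },
  "size": 10
}

On the Kibana side, the fix is to set @timestamp as the time filter field for the alias's index pattern, which as far as I know means recreating the pattern, since the time field is chosen at creation.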