Strange problem finding any documents/logs other than from a specific container (Kubernetes)

Hello, I am using Elasticsearch to index logs from Kubernetes. My problem is that in Kibana (6.8.6) Discover, Visualize, etc., I can only see the logs coming from a single pod, namely the Kibana pod itself, in the logging namespace.

For example, I take the index fluentbit-2020.09.28, which holds 3,090,646 indexed documents:

[ec2-user@ip-10-105-6-233:~ ] es-api.sh _cat/indices?v
health status index                           uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   fluentbit-2020.09.28            I_lfFz-ISlKA5EGhNutvYg   5   1    3090646            0        5gb          2.5gb
[...]
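(es-api.sh is a small local wrapper that forwards the given path to the cluster's REST API; the raw equivalent would be something like curl -s 'http://<es-host>:9200/_cat/indices?v', with <es-host> standing in for the actual address.)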

With a search I try to see which Kubernetes namespaces I am receiving logs from, and there are clearly documents from namespaces other than logging:

GET fluentbit-2020.09.28/_search
{
  "aggregations": {
    "namespaces": {
      "terms": {
        "field": "kubernetes.namespace_name.keyword"
      }
    }
  }
}
#Response:
{
  "took" : 182,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "skipped" : 0,
    "failed" : 0
  },

  [...]

  "aggregations" : {
    "namespaces" : {
      "doc_count_error_upper_bound" : 0,
      "sum_other_doc_count" : 4,
      "buckets" : [
        {
          "key" : "acc",
          "doc_count" : 489132
        },
        {
          "key" : "trn",
          "doc_count" : 433474
        },
        {
          "key" : "tst",
          "doc_count" : 389724
        },
        {
          "key" : "dev1",
          "doc_count" : 387572
        },
        {
          "key" : "dev3",
          "doc_count" : 357916
        },
        {
          "key" : "dev2",
          "doc_count" : 357874
        },
        {
          "key" : "acc-freetext",
          "doc_count" : 335400
        },
        {
          "key" : "kube-system",
          "doc_count" : 194104
        },
        {
          "key" : "monitoring",
          "doc_count" : 64900
        },
        {
          "key" : "logging",
          "doc_count" : 8512
        }
      ]
    }
  }
}
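
To double-check that such documents are actually retrievable, and not only countable by the aggregation, a plain term query against the keyword field can be used as well (a minimal sketch, with monitoring picked as an example namespace):

GET fluentbit-2020.09.28/_search
{
  "size": 1,
  "query": {
    "term": {
      "kubernetes.namespace_name.keyword": "monitoring"
    }
  }
}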

But the problem is that nothing shows up in Kibana, neither in Discover nor in Visualize.

I captured the XHR request that Kibana sends to the _msearch endpoint and replayed it manually:

POST _msearch
{"index":"fluentbit-2020.09.28","ignore_unavailable":true,"preference":1601279552404}
{"version":true,"size":500,"sort":[{"@timestamp":{"order":"desc","unmapped_type":"boolean"}}],"_source":{"excludes":[]},"aggs":{"2":{"date_histogram":{"field":"@timestamp","interval":"30m","time_zone":"Europe/Berlin","min_doc_count":1}}},"stored_fields":["*"],"script_fields":{},"docvalue_fields":[{"field":"@timestamp","format":"date_time"},{"field":"@timestamp-es","format":"date_time"},{"field":"data.createdOn","format":"date_time"},{"field":"data.editedOn","format":"date_time"},{"field":"data.expectedEnd","format":"date_time"},{"field":"data.expiringDate","format":"date_time"},{"field":"data.messages.createdOn","format":"date_time"},{"field":"expectedEnd","format":"date_time"},{"field":"kubernetes.annotations.kubectl_kubernetes_io/restartedAt","format":"date_time"},{"field":"request_received_at","format":"date_time"},{"field":"response_sent_at","format":"date_time"},{"field":"time","format":"date_time"},{"field":"timestamp","format":"date_time"},{"field":"ts","format":"date_time"},{"field":"written_at","format":"date_time"}],"query":{"bool":{"must":[{"range":{"@timestamp":{"gte":1601193592568,"lte":1601279992568,"format":"epoch_millis"}}}],"filter":[{"bool":{"should":[{"match":{"kubernetes.namespace_name":"logging"}}],"minimum_should_match":1}}],"should":[],"must_not":[]}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"fragment_size":2147483647},"timeout":"30000ms"}

And I do get results with the above query. But if, in the clause "match": {"kubernetes.namespace_name": "logging"}, I replace "logging" with "monitoring", I get no results:

{
  "responses" : [
    {
      "took" : 4,
      "timed_out" : false,
      "_shards" : {
        "total" : 5,
        "successful" : 5,
        "skipped" : 0,
        "failed" : 0
      },
      "hits" : {
        "total" : 0,
        "max_score" : null,
        "hits" : [ ]
      },
      "aggregations" : {
        "2" : {
          "buckets" : [ ]
        }
      },
      "status" : 200
    }
  ]
}
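
To rule out anything specific to _msearch, the same filters can also be replayed as a plain _search. A stripped-down sketch that keeps only the time range and the namespace match from the captured request:

GET fluentbit-2020.09.28/_search
{
  "size": 1,
  "query": {
    "bool": {
      "filter": [
        { "range": { "@timestamp": { "gte": 1601193592568, "lte": 1601279992568, "format": "epoch_millis" } } },
        { "match": { "kubernetes.namespace_name": "monitoring" } }
      ]
    }
  }
}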

So the data seems to be in Elasticsearch, but I somehow cannot retrieve it?

I cannot think of a reason, other than that I had in the past searched for "kubernetes.container_name: kibana" in Discover. It seems as if that search had been persisted and I could no longer get rid of it. In the meantime I have also deleted the .kibana_1 index and restarted Kibana, to no avail.

This turned out to be the result of a badly configured index pattern in Kibana, where the wrong field had been chosen as the time field. Kibana adds a range filter on the index pattern's time field to every Discover and Visualize request, so documents that do not contain that field (or whose values lie outside the selected time window) never show up, regardless of what the query itself matches.
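
For anyone hitting the same problem: a quick way to verify this is an exists query on the time field configured in the index pattern, combined with the namespace aggregation from above (a sketch, using @timestamp as the suspect field):

GET fluentbit-2020.09.28/_search
{
  "size": 0,
  "query": {
    "exists": {
      "field": "@timestamp"
    }
  },
  "aggregations": {
    "namespaces": {
      "terms": {
        "field": "kubernetes.namespace_name.keyword"
      }
    }
  }
}

If only the logging namespace shows up in the buckets, the index pattern's time field is the culprit.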

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.