Use the user.name.keyword field in a Kibana visualisation

Hi,

I want to run a Unique Count aggregation on the user.name field from my filebeat-* indices (with the system auth module enabled).
When I do this, I get a Kibana error, "* of * shards failed", and when I go to the "response" tab, I can read:

"Fielddata is disabled on text fields by default. Set fielddata=true on [user.name] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."

So I want to query user.name.keyword instead of just user.name, but I can't find it in the field selector.
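For reference, the Unique Count metric runs a cardinality aggregation under the hood; on the keyword sub-field it should look like this (if the field were selectable — the aggregation name is just an example):

```
GET filebeat-*/_search
{
  "size": 0,
  "aggs": {
    "unique_users": {
      "cardinality": { "field": "user.name.keyword" }
    }
  }
}
```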

When I query my mappings, I can see that the keyword sub-field exists:

GET filebeat-*/_mapping/field/user.name

gives

{
  "filebeat-2020.03.21" : {
    "mappings" : {
      "user.name" : {
        "full_name" : "user.name",
        "mapping" : {
          "name" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          }
        }
      }
    }
  },

So why can't I query user.name.keyword in a Kibana visualisation?

Using user.full_name doesn't give any error, but the data seems empty; I get "0" entries.

I can't find any keyword field in the selector, yet my template contains thousands of keyword fields...

Hi!

On the Discover page, can you query with that field?

In order to debug this one, I would try querying Elasticsearch directly using the console in Kibana's Dev Tools to see if the field is actually populated. Reloading the Kibana page can also help in some cases.
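For example, a quick terms aggregation on the keyword sub-field from the Dev Tools console would show whether it actually holds values (the aggregation name is just an example):

```
GET filebeat-*/_search
{
  "size": 0,
  "aggs": {
    "top_users": {
      "terms": { "field": "user.name.keyword", "size": 10 }
    }
  }
}
```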

Hi,
Thanks for your answer.
I have reloaded the Kibana page many times.
No, I can't query user.name.keyword on the Discover page, but I can query user.name.

Here is a query from the Kibana Dev Tools page:

GET filebeat-*/_search
{
  "size" : 1,
  "query": {
    "match": {
    "system.auth.ssh.method" : "password"
    }
  }
}

returns

{
  "took" : 377,
  "timed_out" : false,
  "_shards" : {
    "total" : 14,
    "successful" : 14,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 10000,
      "relation" : "gte"
    },
    "max_score" : 0.607162,
    "hits" : [
      {
        "_index" : "filebeat-2020.03.24",
        "_type" : "_doc",
        "_id" : "vorZCXEBYvT3hcyRzhXx",
        "_score" : 0.607162,
        "_source" : {
          "agent" : {
            "hostname" : "prodpeda-x2go-bionic2",
            "id" : "d8576c93-d14d-4e1a-b36f-7d99314ad23e",
            "ephemeral_id" : "c2f8ddfb-ba5e-4758-b3ae-4f06c91e19c3",
            "type" : "filebeat",
            "version" : "7.6.1"
          },
          "process" : {
            "name" : "sshd",
            "pid" : 22512
          },
          "log" : {
            "file" : {
              "path" : "/var/log/auth.log"
            },
            "offset" : 10788348
          },
          "source" : {
            "geo" : {
              "continent_name" : "Europe",
              "region_iso_code" : "FR-34",
              "city_name" : "Montpellier",
              "country_iso_code" : "FR",
              "region_name" : "Hérault",
              "location" : {
                "lon" : 3.8772,
                "lat" : 43.6109
              }
            },
            "as" : {
              "number" : 2065,
              "organization" : {
                "name" : "Renater"
              }
            },
            "port" : 41156,
            "ip" : "162.38.151.131"
          },
          "fileset" : {
            "name" : "auth"
          },
          "input" : {
            "type" : "log"
          },
          "@timestamp" : "2020-03-24T01:03:19.000+01:00",
          "system" : {
            "auth" : {
              "ssh" : {
                "method" : "password",
                "event" : "Failed"
              }
            }
          },
          "ecs" : {
            "version" : "1.4.0"
          },
          "service" : {
            "type" : "system"
          },
          "@version" : "1",
          "host" : {
            "hostname" : "prodpeda-x2go-bionic2",
            "os" : {
              "kernel" : "4.15.0-91-generic",
              "codename" : "bionic",
              "name" : "Ubuntu",
              "family" : "debian",
              "version" : "18.04.4 LTS (Bionic Beaver)",
              "platform" : "ubuntu"
            },
            "containerized" : false,
            "name" : "prodpeda-x2go-bionic2",
            "id" : "505c058a50cd4964af2cfddda8f7b5e4",
            "architecture" : "x86_64"
          },
          "event" : {
            "timezone" : "+01:00",
            "module" : "system",
            "action" : "ssh_login",
            "type" : "authentication_failure",
            "category" : "authentication",
            "dataset" : "system.auth",
            "outcome" : "failure"
          },
          "user" : {
            "name" : "deploy"
          }
        }
      }
    ]
  }
}

The thing is, it's possible that I started Logstash before importing the template...
https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-template.html

As my nodes are not directly connected to Elasticsearch, I used the manual export/import procedure:
https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-template.html#load-template-manually-alternate

Is there a way to force Elasticsearch to apply the template to the next indices,
and then use the Reindex API on the previous ones?
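From the docs, index templates are only applied at index creation time, so once the template is imported the next daily filebeat-* index should pick it up automatically; existing indices would need to be reindexed into a new index whose name matches the template pattern. A sketch (the destination index name is just an example):

```
POST _reindex
{
  "source": { "index": "filebeat-2020.03.21" },
  "dest":   { "index": "filebeat-2020.03.21-restored" }
}
```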

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.