Generic search between docs added with Logstash or Filebeat

I don't have control over how the JSON logs I'm trying to search get added to Elasticsearch, but it's always via Filebeat or Logstash. Something I've noticed is that, for a specific field, Filebeat uses the mapping:

            "signature": {
              "type": "keyword",
              "ignore_above": 1024
            },

but Logstash will use the mapping:

            "signature": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            },

So for Logstash users I can use a basic query_string,

"query_string": {
    "query": "HUNT"
}

and I'll get results where the signature field contains the string "HUNT".
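
For context, the full request body in that case is just the query_string wrapped in a query clause, roughly like this (a minimal sketch of my test request, nothing else in it):

{
    "query": {
        "query_string": {
            "query": "HUNT"
        }
    }
}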

However, for Filebeat I don't get substring matches; it only matches on the complete string. At least in my test setup, I also have to add the "fields" parameter to "query_string", otherwise I get an error about "field expansion matches too many fields".
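
For reference, the Filebeat-side query I've been testing looks roughly like the following; the field list is just what I happened to use, so treat it as a sketch rather than the exact request:

{
    "query": {
        "query_string": {
            "fields": ["signature"],
            "query": "HUNT"
        }
    }
}

With the Filebeat keyword mapping this only returns documents where signature is exactly "HUNT".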

Is there a way to structure the query where it will match on a portion of the value, and work across the document indexed by Logstash OR Filebeat?

Thanks!

You need to have a consistent mapping IMO.
Check that the templates are the same for all index names.
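
If it helps, you can see what is installed with the template APIs and compare the signature mapping directly. For example (the template names here are just guesses based on the Filebeat and Logstash defaults; on recent versions the composable-template equivalent is GET _index_template):

GET _template/filebeat*
GET _template/logstash*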

Within a given Elasticsearch installation the mapping will be consistent. I'm more interested in writing a query that gives me what I want without having to probe for the mapping in use.
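
The closest I've come up with so far is listing both field variants explicitly and using wildcards, something like the sketch below, but I haven't verified it behaves the same against both mappings (the field names and wildcard usage are just my guess at covering the keyword-only and text-plus-keyword cases):

{
    "query": {
        "query_string": {
            "fields": ["signature", "signature.keyword"],
            "query": "*HUNT*",
            "analyze_wildcard": true
        }
    }
}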
