I don't have control over how the JSON logs I'm trying to search get added to Elasticsearch, but it's always via Filebeat or Logstash. Something I've noticed is that, for a specific field, Filebeat produces the mapping:
```json
"signature": {
  "type": "keyword",
  "ignore_above": 1024
},
```
but Logstash will use the mapping:
```json
"signature": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
},
```
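If I understand multi-fields correctly, the Logstash mapping exposes both an analyzed `signature` field and an exact-match `signature.keyword` sub-field, so exact matching is also possible there with something like this (the index pattern and value are just placeholders from my setup):

```json
GET logstash-*/_search
{
  "query": {
    "term": {
      "signature.keyword": "HUNT"
    }
  }
}
```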
So for Logstash-indexed documents I can use a basic query_string query,
```json
"query_string": {
  "query": "HUNT"
}
```
and I'll get results where the signature field contains the string "HUNT".
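For context, the full request I'm sending looks roughly like this (index pattern is a placeholder for whatever I'm searching):

```json
GET logstash-*/_search
{
  "query": {
    "query_string": {
      "query": "HUNT"
    }
  }
}
```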
However, for Filebeat indices I don't get substring matches; it only matches when the value is exactly the complete query string. At least in my test setup, I also have to add the "fields" parameter to "query_string", otherwise I get an error about "field expansion matches too many fields".
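For reference, this is roughly the Filebeat-side query that avoids the expansion error for me, but it only matches the exact value "HUNT", not substrings:

```json
"query_string": {
  "query": "HUNT",
  "fields": ["signature"]
}
```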
Is there a way to structure the query so that it matches on a portion of the value and works across documents indexed by either Logstash or Filebeat?
Thanks!