Aggregate / reformat search results from buckets into new fields

We are using ELK 5.6.8 and I am trying to set up a search for a watcher that would periodically go through the logs and do the following:

  1. search for a certain string in a field
  2. aggregate the results per host
  3. filter out the hosts whose count in the bucket is below a threshold
  4. create fields that contain
    a. the number of buckets
    b. the content of the buckets formatted into a string
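
The watch wrapping all this would be shaped roughly like the following (a minimal sketch: the watch id hosts_alert, the 10m interval, the buckets-not-empty condition, and the elided actions are placeholders, and the search body would be the query shown further down):

PUT _xpack/watcher/watch/hosts_alert
{
  "trigger": {
    "schedule": { "interval": "10m" }
  },
  "input": {
    "search": {
      "request": {
        "indices": ["my-logs-*"],
        "body": { ... }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.aggregations.count_per_host.buckets.0": { "not_eq": null }
    }
  },
  "actions": { ... }
}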

What I have so far for the search itself:

GET my-logs-*/_search
{
  "query": {
    "bool": {
      "must": [
        { "query_string": { "fields": ["message_json.msg"], "query": "'something happened'"} }
      ],
      "filter": {
        "range": {
          "@timestamp": {
            "gt": "now-600m"
          }
        }
      }
    }
  },
  "aggregations": {
    "count_per_host": {
      "terms": {
        "field": "message_json.hostname.keyword",
        "order" : { "_count" : "desc" },
        "min_doc_count": 15
      }
    }
  }
}
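
One detail worth noting here: the terms aggregation returns at most 10 buckets by default, so hosts past the top 10 would be dropped even if they clear min_doc_count. Adding a size cap avoids that (100 is an arbitrary upper bound):

"count_per_host": {
  "terms": {
    "field": "message_json.hostname.keyword",
    "order" : { "_count" : "desc" },
    "min_doc_count": 15,
    "size": 100
  }
}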

Running that search, I get a result like this:

{
  "took": 82,
  "timed_out": false,
  "_shards": {
    "total": 99,
    "successful": 99,
    "skipped": 0,
    "failed": 0
  },
  "hits": {...},
  "aggregations": {
    "count_per_host": {
      "doc_count_error_upper_bound": 0,
      "sum_other_doc_count": 0,
      "buckets": [
        {
          "key": "host-79gzj",
          "doc_count": 20
        },
        {
          "key": "host-gph59",
          "doc_count": 18
        }
      ]
    }
  }
}

But I could not figure out how to do the last step and produce fields like these:

"hosts_number": 2   <- aggregations.count_per_host.buckets.length()
"hosts_string": "host-79gzj matched 20 times, host--gph59 matched 18 times"

I have a hunch that a script should be used, but I cannot get it working with the resulting buckets... :disappointed: Any ideas are very much appreciated.
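
What I have been experimenting with is a script transform on the watch that rebuilds the payload from the buckets. A rough Painless sketch of the idea (it may well be where my mistake is; in 6.x the inline key becomes source):

"transform": {
  "script": {
    "lang": "painless",
    "inline": "def buckets = ctx.payload.aggregations.count_per_host.buckets; def parts = []; for (def b : buckets) { parts.add(b.key + ' matched ' + b.doc_count + ' times'); } return ['hosts_number': buckets.size(), 'hosts_string': String.join(', ', parts)];"
  }
}

If that worked, the action templates should then be able to reference {{ctx.payload.hosts_number}} and {{ctx.payload.hosts_string}} directly.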
