Aggregation to find new weekly logs

I've been trying to figure this out, but I seem to keep getting stuck and would love a nudge in the right direction.

I have logs coming in each week. Some logs have been occurring for the past several weeks, while others started this week, and I'd like to know which logs fall into which of the two categories. This is what I have so far:

GET /logst*/_search/
{
  "size": 0,
  "query": {
    "bool": {
      "filter": {
        "range": {
          "@timestamp": {
            "gte": "now-3w/d"
          }
        }
      }
    }
  },
  "aggs": {
    "JIRA": {
      "terms": {
        "field": "ErrorCodeUnique.keyword",
        "size": 1000,
        "order": {
          "FirstOccurrence": "desc"
        }
      },
      "aggs": {
        "FirstOccurrence": {
          "min": {
            "field": "@timestamp"
          }
        }
      }
    }
  }
}

This returns a sample result like:

"hits": {
    "total": ######,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "JIRA": {
      "doc_count_error_upper_bound": 0,
      "sum_other_doc_count": ######,
      "buckets": [
        {
          "key": "E-11111 - 111111",
          "doc_count": 1,
          "FirstOccurrence": {
            "value": 1504208992000,
            "value_as_string": "2017-08-31T19:49:52.000Z"
          }
        },
        {
          "key": " E-22222- 222222",
          "doc_count": 3,
          "FirstOccurrence": {
            "value": 1504208773000,
            "value_as_string": "2017-08-30T19:46:13.000Z"
          }
        },
        {
          "key": "E - 1234 - 1234",
          "doc_count": 1,
          "FirstOccurrence": {
            "value": 1504208562000,
            "value_as_string": "2017-08-21T19:42:42.000Z"
          }
        },
        {
          "key": "E-1144 - 01176",
          "doc_count": 1,
          "FirstOccurrence": {
            "value": 1504207878000,
            "value_as_string": "2017-08-01T19:31:18.000Z"
          }
        }

So given this, I'd like to somehow filter those aggregated FirstOccurrence dates so that I only see dates from now-1w, but I can't figure out how. I've been trying with bucket_script and bucket_selector aggregations, but to no avail. I also tried to just use

"range": {
  "field": "@timestamp",
  "ranges": [
    {
      "from": "now-1d/d",
      "to": "now"
    }
  ]
}

But I don't need to filter the timestamp field; I need to filter the FirstOccurrence aggregation.
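To put it another way, what I'm after is the server-side equivalent of this client-side post-filter (a Python sketch over the sample buckets above, with a fixed "now" chosen for illustration):

```python
from datetime import datetime, timedelta, timezone

# FirstOccurrence dates copied from the sample response above.
buckets = {
    "E-11111 - 111111": "2017-08-31T19:49:52.000Z",
    " E-22222- 222222": "2017-08-30T19:46:13.000Z",
    "E - 1234 - 1234":  "2017-08-21T19:42:42.000Z",
    "E-1144 - 01176":   "2017-08-01T19:31:18.000Z",
}

def parse(ts):
    # Parse the ISO-8601 value_as_string from the response.
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

# Fixed "now" for illustration; in practice this would be the query time.
now = datetime(2017, 9, 1, tzinfo=timezone.utc)
cutoff = now - timedelta(weeks=1)

# Keep only the error codes whose earliest occurrence is within the last
# week, i.e. the errors that are new this week.
new_this_week = [key for key, ts in buckets.items() if parse(ts) >= cutoff]
```

With the sample dates above, only the first two buckets (first seen 2017-08-31 and 2017-08-30) would survive the one-week cutoff.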

Can anyone give any guidance? :blush:

A bucket_selector aggregation should work here. Take a look at the example below. The bucket_selector script checks whether the current time in milliseconds minus the result of the FirstOccurrence aggregation is less than 604800000 (the number of milliseconds in one week). Only those buckets with a FirstOccurrence from the last week are returned.

Make sure the time is set identically on all the servers hosting your nodes; otherwise you will get inconsistent results (System.currentTimeMillis() is evaluated on the node holding the shard).

GET /logst*/_search/
{
  "size": 0,
  "query": {
    "bool": {
      "filter": {
        "range": {
          "@timestamp": {
            "gte": "now-3w/d"
          }
        }
      }
    }
  },
  "aggs": {
    "JIRA": {
      "terms": {
        "field": "ErrorCodeUnique.keyword",
        "size": 1000,
        "order": {
          "FirstOccurrence": "desc"
        }
      },
      "aggs": {
        "FirstOccurrence": {
          "min": {
            "field": "@timestamp"
          }
        },
        "last_one_week": {
          "bucket_selector": {
            "buckets_path": {
              "FirstOccurrence": "FirstOccurrence"
            },
            "script": "System.currentTimeMillis() - params.FirstOccurrence < 604800000"
          }
        }
      }
    }
  }
}
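For intuition, the check the Painless script performs is plain epoch-millisecond arithmetic. A rough Python equivalent, using a sample FirstOccurrence value from the thread and a hypothetical "now" of 2017-09-01T00:00:00Z:

```python
import time

WEEK_MS = 7 * 24 * 60 * 60 * 1000  # 604800000, one week in milliseconds

def first_seen_this_week(first_occurrence_ms, now_ms=None):
    """Mirror of the bucket_selector script:
    System.currentTimeMillis() - params.FirstOccurrence < 604800000
    """
    if now_ms is None:
        now_ms = int(time.time() * 1000)  # like System.currentTimeMillis()
    return now_ms - first_occurrence_ms < WEEK_MS

# Sample bucket value 1504208992000 (2017-08-31T19:49:52Z) against a
# hypothetical "now" of 1504224000000 (2017-09-01T00:00:00Z):
first_seen_this_week(1504208992000, now_ms=1504224000000)  # -> True
```

The same caveat as above applies: with the default `now_ms`, the result depends on the clock of whichever machine evaluates it.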

Awesome! I was starting to think it wasn't possible to do this, but this is a neat little way to go about it!

Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.