Identifying the number of logs per 5 minutes, over a week?

We ship AWS ALB logs to Kibana.

Each log has many key-value fields, but the two we are interested in are client_ip and timestamp.

For example:
timestamp: Jan 23, 2025 @ 14:40:00.268

client_ip: 8.8.8.8

I am trying to work out the maximum number of API requests over a period of 5 minutes for an individual IP address.

I can go to Visualize -> Create Visualization -> Data Table -> Select the saved search

Then

Bucket -> Split rows -> Aggregation: Terms -> Field: client_ip.

I can then change the time frame (to 5 minutes, 60 minutes, between two timestamps, etc.) to get the total per individual IP within that window.

Ideally I would like to search the last 7 days of logs and find the maximum requests per 5 minutes per IP.

Is this possible?

Hi @user-27022024

You need to build an ES|QL chart for this kind of thing (with the limit of 10,000 rows per table), as regular charts are bound to the number of buckets generated (5-minute buckets over 7 days is about 2,000 buckets per IP, potentially...).

A query like this could help:

FROM your_index | STATS count = COUNT() BY BUCKET(@timestamp, 5 minutes), client_ip
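A fuller sketch of that query (assuming `your_index` and the field names `@timestamp`/`client_ip` stand in for your actual index and mapping) that also restricts to the last 7 days and surfaces the busiest buckets first might look like:

```esql
FROM your_index
// keep only the last 7 days of logs
| WHERE @timestamp > NOW() - 7 days
// count requests per (5-minute bucket, IP) pair
| STATS count = COUNT() BY bucket = BUCKET(@timestamp, 5 minutes), client_ip
// busiest buckets first
| SORT count DESC
| LIMIT 100
```

Note that ES|QL results are capped (10,000 rows per table), so sorting before limiting matters if you only care about the busiest windows.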


@user-27022024 Maybe it's just me, but I find the request a bit ambiguous.

So my interpretation was: for every IP that made at least one request over the last 7 days, you want to know its most "active" 5-minute slot, and how active it was (in number of requests).

So you would have a table with one row per unique client_ip and three columns:

  1. IP
  2. timestamp (effectively a 5-minute bucket)
  3. the number of requests in that 5-minute bucket for that specific IP
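Under that interpretation, a two-stage STATS sketch (again assuming `your_index`, `@timestamp`, and `client_ip` as placeholders) could collapse the bucketed counts down to one row per IP:

```esql
FROM your_index
| WHERE @timestamp > NOW() - 7 days
// stage 1: requests per (5-minute bucket, IP)
| STATS count = COUNT() BY bucket = BUCKET(@timestamp, 5 minutes), client_ip
// stage 2: the busiest 5-minute count per IP
| STATS max_in_5m = MAX(count) BY client_ip
| SORT max_in_5m DESC
```

One caveat: plain MAX drops the bucket timestamp, so this gives columns 1 and 3 of the table above; to recover *which* 5-minute slot was the busiest, you would sort the stage-1 output by count instead of aggregating a second time.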

BUT I can also see that the text could be interpreted differently.

Can you confirm/clarify?

That is correct.

I'm looking to get a baseline of how many requests each IP is making over 5 minute periods.

The end goal is to have a rate limit, but to do that I first need to work out the current request pattern.
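For baselining a rate limit, the raw maximum can be skewed by one-off spikes, so it may help to look at percentiles of the per-window counts as well. A sketch (same placeholder index and field names as above):

```esql
FROM your_index
| WHERE @timestamp > NOW() - 7 days
// requests per (5-minute bucket, IP)
| STATS count = COUNT() BY bucket = BUCKET(@timestamp, 5 minutes), client_ip
// distribution of per-window request counts for each IP
| STATS p50 = MEDIAN(count), p95 = PERCENTILE(count, 95), max = MAX(count) BY client_ip
| SORT p95 DESC
```

A limit set a comfortable margin above the observed p95 (rather than the max) tends to catch abusive clients without throttling normal bursts.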