Search large amounts of data as quickly as possible

Hey,
I think I need your help with the following use case:

I monitor an IDS and receive about 300 million records containing IP addresses every day.
For each unique IP address, I need to find its most recent entry (event).

My previous attempt ended with the following query:

{
    "track_total_hits": true,
    "size": 0,
    "query": {
        "term": {
            "ip.keyword": "123456789"
        }
    },
    "sort": { "@timestamp": "desc" },
    "aggs": {
        "top_group_hits": {
            "top_hits": {
                "sort": [
                    {
                        "@timestamp": { "order": "desc" }
                    }
                ],
                "size": 1
            }
        }
    }
}

Judging by the query duration, this is a disaster. Do you have any suggestions for improvement?

Hi @error_701 Welcome to the community.

Perhaps you should look at creating a latest transform with the IP address as the unique key.

This will create an index that keeps only the last entry for each unique IP.

Just a thought... I just did the same thing for high-volume logs, keyed on host.name.
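
A minimal sketch of what such a latest transform could look like. The transform id, index names, and sync settings here are assumptions — adjust them to your own indices and mapping:

```json
PUT _transform/latest-ip-events
{
  "source": { "index": "ids-events-*" },
  "dest": { "index": "latest-ip-events" },
  "latest": {
    "unique_key": ["ip.keyword"],
    "sort": "@timestamp"
  },
  "sync": {
    "time": { "field": "@timestamp", "delay": "60s" }
  },
  "frequency": "1m"
}
```

After starting it with `POST _transform/latest-ip-events/_start`, looking up the last event for an IP becomes a simple term query against the small destination index, instead of a `top_hits` aggregation over 300 million documents.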