I can imagine that with a low frequency setting the processing would be spread out more over time and could be more efficient than one big load every hour.
Am I right?
You are correct.
Since it doesn't matter if the data is a day old, I think it comes down to whether you want a spike of search traffic every hour or a constant load every 5m.
Since the frequency is 1h, assuming a flat rate of logs per hour, 6,000,000 / 24 ≈ 250k docs each hour. With the max page search size of 30k, it will take 9 pages to process those docs. Each 30k page may cause a large memory spike, and aggregating 30k docs may create a large search load. Lowering the page size will reduce the memory and load, but the transform will take longer to iterate through the docs. As long as the transform finishes its checkpoint within the hour (which seems likely), it won't fall behind. That might help flatten out the impact (if that's desired).
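If you want to try a smaller page size, something like this should work via the transform update API (a minimal sketch; `my-transform` and the `10000` value are placeholders, assuming the transform already exists):

```json
# placeholder transform id and page size; adjust to your setup
POST _transform/my-transform/_update
{
  "settings": {
    "max_page_search_size": 10000
  }
}
```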
Increasing the frequency (i.e., setting a smaller frequency value, such as 5m) means fewer docs to search over per checkpoint, which will also reduce the memory and search load.
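The frequency can be changed through the same update API (again a sketch with placeholder values):

```json
# placeholder transform id; 5m means a checkpoint check every 5 minutes
POST _transform/my-transform/_update
{
  "frequency": "5m"
}
```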