I'm trying to understand how a filter is processed by the cluster. In particular, I have a metric that gives me the average time queries spend in the "query phase", and the "took" time gives me a rough picture of the query phase plus the fetch phase combined.
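For context, the metric is along the lines of what the index search stats expose (the index name below is a placeholder; I derive the average as `query_time_in_millis / query_total`):

```
GET /my-index/_stats/search

# The relevant part of the response looks like:
#
#   "search": {
#     "query_total": 1234,
#     "query_time_in_millis": 5678,
#     "fetch_total": 1234,
#     "fetch_time_in_millis": 91011,
#     ...
#   }
```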
When I run a filter-only query, the "query phase" time is tiny (<1ms) even though the total "took" time is ~250ms. On the other hand, when I run a full-text search query, the query phase time is larger (e.g. 30ms) even though the overall "took" time is much smaller (maybe 50-100ms).
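To make the comparison concrete, the two requests are shaped roughly like this (field names and values are placeholders, not my real mapping):

```
# Filter-only query: query phase <1ms, took ~250ms
GET /my-index/_search
{
  "size": 50,
  "query": {
    "bool": {
      "filter": [
        { "term": { "status": "published" } }
      ]
    }
  }
}

# Full-text query: query phase ~30ms, took ~50-100ms
GET /my-index/_search
{
  "size": 50,
  "query": {
    "match": { "title": "some search terms" }
  }
}
```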
I'm trying to get a better understanding of where the filter part of the query runs. Is it not included in the query phase time at all, or is it just that filters are so much more efficient that the query phase finishes much faster? And is there some reason the fetch phase would take longer for a filter-only query? Both of the queries I'm using to test this return the same size, fields, aggs, etc.
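Would running both requests with `"profile": true` be the right way to see where the filter is actually executed? e.g.:

```
# profile shows a per-shard breakdown of query-phase execution,
# so presumably the filter clause would show up here if it runs
# in the query phase (same placeholder field as above)
GET /my-index/_search
{
  "profile": true,
  "query": {
    "bool": {
      "filter": [
        { "term": { "status": "published" } }
      ]
    }
  }
}
```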