Is the date filter costly?

I have one log file whose feed is lagging about 1-2 minutes behind the current time (as viewed from Kibana).
The feed path is: logstash shipper -> redis -> logstash indexer -> es,
and the indexer has a date filter configured as below, where EventTime comes from grok:
date {
  match => ["EventTime", "HH:mm:ss.SSS YYYY-MM-dd"]
  remove_field => ["EventTime"]
}
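
(The grok filter itself isn't the issue here; roughly, it captures the time-then-date string into EventTime, something like the sketch below, though the exact pattern of course depends on the actual log layout:)

grok {
  # Illustrative only; the real pattern depends on the actual log format.
  match => [ "message", "(?<EventTime>%{TIME} %{YEAR}-%{MONTHNUM}-%{MONTHDAY}) %{GREEDYDATA}" ]
}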

I'm not sure which part causes the delay, so I removed the above date filter from the indexer (note that I never removed the grok filter) so that I'd have two times, @timestamp and EventTime, and could check whether the shipper is the one causing the problem. Surprisingly, after I made that change the feed is no longer lagging (it actually lags about 10s, but that's acceptable). Does this mean the date filter is actually quite costly? Is it still recommended to use?
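
For reference, another way to keep both timestamps without dropping the date filter entirely would be to parse EventTime into a separate field via the filter's target option, something like the sketch below (the field name event_time_parsed is just an example):

date {
  match => ["EventTime", "HH:mm:ss.SSS YYYY-MM-dd"]
  # Parse into a separate field instead of overwriting @timestamp, so the
  # shipper-assigned @timestamp and the log's own EventTime can be compared.
  target => "event_time_parsed"
}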

Thank you.

No, the date filter is quite fast. Are the system clocks of all machines in sync?

Yes, they're synced.

I have about 10K log messages per 30 seconds; is this considered a lot for the date filter?

The @timestamp is generated by the shipper, correct?

> I have about 10K log messages per 30 seconds; is this considered a lot for the date filter?

So about 300 msg/s. That's not a lot.

> The @timestamp is generated by the shipper, correct?

If a @timestamp field doesn't exist, Logstash will add one. So yes, it'll be added by the shipper, or by whatever Logstash instance touches the message first.
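
To narrow down where the lag is introduced, you could also stamp the event again when the indexer picks it up and compare that against the shipper-assigned @timestamp. A rough sketch for the indexer config, assuming a Logstash version whose ruby filter uses the event.set API (the field name indexer_received_at is just an example):

ruby {
  # Record when the indexer sees the event; the difference between this
  # and @timestamp is roughly the time spent in Redis and in transit.
  code => "event.set('indexer_received_at', Time.now.utc.strftime('%Y-%m-%dT%H:%M:%S.%LZ'))"
}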

What can I do to further troubleshoot this problem?