I have found an issue with the Cisco Umbrella integration.
I observed that all of the events processed into ELK ended up with a timestamp of when the document was ingested into ELK rather than the date/time of the event from Umbrella.
Some digging traced the issue to the ingest pipeline this integration uses.
You will note that the CSV processor reads the event date/time into the "cisco.umbrella._tmp.time" field, and then, towards the end of the pipeline, a date processor parses "cisco.umbrella._tmp.time" into the "@timestamp" field, but only if "cisco.umbrella._tmp.time" exists.
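For context, here is a rough sketch of that date conversion as I understand it. This is my own reconstruction, not a copy of the file; the date format shown is a placeholder.

```yaml
  # Near the end of the pipeline: promote the parsed event time to @timestamp,
  # but only when cisco.umbrella._tmp.time is still present on the document
  - date:
      if: ctx.cisco?.umbrella?._tmp?.time != null
      field: cisco.umbrella._tmp.time
      target_field: "@timestamp"
      formats:
        - ISO8601   # placeholder; the real pipeline defines its own format
```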
However, earlier in the pipeline (before that date conversion) there is a script processor that does the following:
"ctx.cisco.umbrella._tmp = new HashMap();"
You can see this at line 241 here: https://github.com/elastic/integrations/blob/main/packages/cisco_umbrella/data_stream/log/elasticsearch/ingest_pipeline/default.yml
Because the script overwrites "_tmp" with a new HashMap, the previously parsed time is dropped, so the date conversion is skipped and "@timestamp" falls back to the ingest time.
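To make the effect concrete, here is a minimal illustration of how that assignment behaves inside a script processor. This is an illustration only, not an excerpt of the linked file.

```yaml
  # Illustration only: assigning a fresh HashMap replaces the whole _tmp object
  - script:
      lang: painless
      source: |
        // cisco.umbrella._tmp.time was populated by the CSV processor before this point
        ctx.cisco.umbrella._tmp = new HashMap();
        // _tmp is now empty, so the later date processor's "exists" check fails
```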
I have manually addressed this by modifying the pipeline to put the time into a different field initially, e.g. "cisco.umbrella.time", and then using that field in the date conversion.
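Roughly, the change looks like the sketch below. The "cisco.umbrella.time" field is the one I introduced; the column list and date format are placeholders rather than the pipeline's exact contents.

```yaml
processors:
  # CSV processor: parse the event time into a field the later script does not touch
  - csv:
      field: message
      target_fields:
        - cisco.umbrella.time
        # ...remaining Umbrella columns elided...
  # ...the script that resets cisco.umbrella._tmp can now run without losing the time...
  # Date conversion now reads the new field
  - date:
      if: ctx.cisco?.umbrella?.time != null
      field: cisco.umbrella.time
      target_field: "@timestamp"
      formats:
        - ISO8601   # placeholder; keep whatever format the pipeline already uses
  # Optional tidy-up so the temporary value is not indexed
  - remove:
      field: cisco.umbrella.time
      ignore_missing: true
```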