Storing rollups externally

How difficult would it be to implement a process that generates daily rollups in a separate document store, such as MongoDB?

I've got a system that processes log files and stores them as aggregated rollups in MongoDB. It doesn't, however, manage transport or archiving, nor does it allow for drilling down into advanced metrics, so I'm trying to incorporate Logstash/Filebeat and Elasticsearch into the pipeline. I've been able to get the logs into Elasticsearch, and I can get the logs into MongoDB, but that way I would end up processing the log files twice.

What options are available to generate aggregates from Filebeat, Logstash, or Elasticsearch, with certainty that the totals are accurate?
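
For what it's worth, one option is to let Filebeat/Logstash ingest into Elasticsearch only, then run a small scheduled job that asks Elasticsearch for a daily date_histogram aggregation and upserts the buckets into MongoDB, so the raw files are parsed exactly once. A minimal sketch follows; the index pattern `logs-*`, the `@timestamp` and `bytes` fields, and both connection URLs are assumptions standing in for whatever your pipeline actually uses, and it targets the Python `elasticsearch` 8.x and `pymongo` clients:

```python
# Sketch only, not a drop-in: index pattern, field names, and URLs
# are assumptions. Pull daily buckets out of Elasticsearch and upsert
# them into MongoDB, so log files are only parsed once (at ingest).
from elasticsearch import Elasticsearch
from pymongo import MongoClient

es = Elasticsearch("http://localhost:9200")
rollups = MongoClient("mongodb://localhost:27017")["metrics"]["daily_rollups"]

resp = es.search(
    index="logs-*",
    size=0,  # skip raw hits; we only want the aggregation buckets
    aggs={
        "per_day": {
            "date_histogram": {"field": "@timestamp",
                               "calendar_interval": "day"},
            "aggs": {
                "events": {"value_count": {"field": "@timestamp"}},
                "total_bytes": {"sum": {"field": "bytes"}},
            },
        }
    },
)

for bucket in resp["aggregations"]["per_day"]["buckets"]:
    # Upsert keyed on the day, so re-running the job replaces a day's
    # totals instead of duplicating them.
    rollups.update_one(
        {"day": bucket["key_as_string"]},
        {"$set": {"events": bucket["events"]["value"],
                  "total_bytes": bucket["total_bytes"]["value"]}},
        upsert=True,
    )
```

Run from cron (ideally with a date-range filter on the query so each run only covers the previous day), this avoids reprocessing the files while keeping MongoDB as the system of record for the rollups.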

What do you mean by rollup here?

Stored aggregations at preset time intervals.

You can't have accurate rollups; aggregations of this kind are approximations based on averages of the data.

Also, if you want to do this, why not store it in ES?

Can you clarify what you mean?

You can't have accurate rollups; aggregations of this kind are approximations based on averages of the data

If my aggregations are just running totals based on several parameters, why would they not be accurate?
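
As far as I know, that objection mainly applies to Elasticsearch's approximate aggregations (percentiles via TDigest, cardinality via HyperLogLog++). `sum`, `min`, `max`, and `value_count` are computed over every matching document, so running totals are exact up to floating-point rounding, provided every event was actually indexed. A hedged sketch of exact totals grouped by several parameters using a composite aggregation, where the `status` keyword field is an assumption:

```python
# Hedged sketch: exact running totals grouped by two parameters
# (day x an assumed "status" keyword field) via a composite
# aggregation, paging through every bucket with after_key.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
after_key = None

while True:
    composite = {
        "size": 500,
        "sources": [
            {"day": {"date_histogram": {"field": "@timestamp",
                                        "calendar_interval": "day"}}},
            {"status": {"terms": {"field": "status"}}},
        ],
    }
    if after_key:
        composite["after"] = after_key  # resume from the last page

    resp = es.search(
        index="logs-*",
        size=0,
        aggs={"rollup": {"composite": composite,
                         "aggs": {"total_bytes": {"sum": {"field": "bytes"}}}}},
    )
    agg = resp["aggregations"]["rollup"]
    for bucket in agg["buckets"]:
        # doc_count and the sum cover all matching docs, not a sample,
        # so these totals are exact (modulo float rounding).
        print(bucket["key"], bucket["doc_count"], bucket["total_bytes"]["value"])

    after_key = agg.get("after_key")
    if after_key is None:
        break  # no more pages of buckets
```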

Why not store it in ES?

The idea is that the info stored in ES allows advanced data extraction in rare cases.
The rollups are totals that can be stored on a separate server; the ES server would then not require the kind of high availability that is otherwise expected of it.
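
To make that split concrete, here is a minimal sketch of the read path under the same assumptions as the hypothetical job above (collection name, field names, and the ISO-8601 `key_as_string` day format are all placeholders): once the totals live in MongoDB, reporting queries never touch Elasticsearch, so the ES cluster can be down without affecting day-to-day metrics.

```python
# Hedged sketch: serve reports straight from the MongoDB rollup store.
# Assumes days were stored as ISO-8601 strings (Elasticsearch's
# key_as_string), so lexicographic range matching lines up with dates.
from pymongo import MongoClient

rollups = MongoClient("mongodb://localhost:27017")["metrics"]["daily_rollups"]

# Monthly report: sum January's daily totals from the rollup store.
pipeline = [
    {"$match": {"day": {"$gte": "2024-01-01", "$lt": "2024-02-01"}}},
    {"$group": {"_id": None,
                "events": {"$sum": "$events"},
                "total_bytes": {"$sum": "$total_bytes"}}},
]
for row in rollups.aggregate(pipeline):
    print(row["events"], row["total_bytes"])
```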