Reduce the size of log data

I have many terabytes of logs. Is there a way to create aggregates of the log data so I can build reports over large time windows without having to process a huge number of raw logs?
An example:
every day 1,000 people access a website, and each access is stored as a single log entry with the time, IP address, etc.
It would be useful to aggregate this data so that an annual report, with the total number of accesses, the peak in connections, and the top source IP addresses for example, can be produced without analyzing every individual log.

You can use something like the aggregate filter in Logstash, or store the raw data in Elasticsearch and then use the rollup API (_rollup) to achieve the same kind of summarization.
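If you want to see the idea without committing to either tool, here is a minimal plain-Python sketch (not the Logstash aggregate filter or the rollup API themselves) that collapses one day of raw access logs into a small summary record. The tab-separated "timestamp, IP" input format and the hour-level peak bucket are assumptions you would adapt to your own log layout; the per-day summaries it emits are tiny compared to the raw logs and are all an annual report needs.

```python
#!/usr/bin/env python3
"""Roll one day of raw access logs up into a single aggregate record."""
from collections import Counter
from datetime import datetime
import json
import sys


def aggregate(lines):
    total = 0
    per_hour = Counter()   # connections per hour bucket -> used for the peak
    per_ip = Counter()     # hits per source IP -> used for the top talkers

    for line in lines:
        # Assumed format: "2024-05-01T09:13:42<TAB>203.0.113.7"
        ts_str, ip = line.rstrip("\n").split("\t", 1)
        ts = datetime.fromisoformat(ts_str)
        total += 1
        per_hour[ts.strftime("%Y-%m-%d %H:00")] += 1
        per_ip[ip] += 1

    peak_hour, peak_count = max(
        per_hour.items(), key=lambda kv: kv[1], default=("", 0)
    )
    return {
        "total_accesses": total,
        "peak_hour": peak_hour,
        "peak_connections": peak_count,
        "top_source_ips": per_ip.most_common(10),
    }


if __name__ == "__main__":
    # e.g.  python rollup_day.py < access-2024-05-01.log > summary-2024-05-01.json
    print(json.dumps(aggregate(sys.stdin), indent=2))
```

The yearly report then only has to read 365 small JSON summaries instead of terabytes of raw entries; the same shape of summary is what a rollup job or an aggregate filter would persist for you automatically.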
