I am trying to determine storage requirements for logging various systems. It's easy when all logs for a given system go into a dedicated index (e.g. all firewall logs into a firewall index): I can just look at the daily index size and say firewall logs are ~x GB/day.
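For the dedicated-index case, something like this is all I need; a minimal sketch against the _cat/indices API, where the endpoint, credentials, and the firewall-* daily index pattern are placeholders for my setup:

```python
# Minimal sketch: list per-day store size for a dedicated firewall index.
# Assumes daily-dated indices matching "firewall-*"; the endpoint and
# credentials below are placeholders.
import requests

ES_URL = "https://localhost:9200"

resp = requests.get(
    f"{ES_URL}/_cat/indices/firewall-*",
    params={"format": "json", "bytes": "b", "h": "index,pri.store.size,store.size"},
    auth=("elastic", "changeme"),
)
for idx in sorted(resp.json(), key=lambda i: i["index"]):
    size_gb = int(idx["store.size"]) / 1024**3
    print(f"{idx['index']}: {size_gb:.2f} GB on disk (primaries + replicas)")
```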
What I have not figured out is how to measure the daily size of, say, Linux system logs that are sent to a shared index (e.g. an index that also holds application logs).
For example: what is the average daily disk usage within Elasticsearch, per host, for /var/log/messages and /var/log/secure?
Is there some way to query that directly, or would you need to run a query, count the documents, and multiply by some "guesstimated average log message size" factor?
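For concreteness, this is the count-and-multiply approach I had in mind; a rough sketch where the shared index pattern, the ECS-style field names (host.name, log.file.path), and the 500-byte average event size are all assumptions/guesses on my part:

```python
# Rough sizing sketch: count yesterday's docs per host for two source files in
# a shared index, then multiply by a guesstimated average stored size per event.
# The index pattern, field names, and AVG_EVENT_BYTES are assumptions -- adjust
# to whatever the actual pipeline/mappings use.
import requests

ES_URL = "https://localhost:9200"
INDEX = "linux-*"                 # placeholder shared index pattern
AVG_EVENT_BYTES = 500             # guesstimate; refine by sampling real events

body = {
    "size": 0,
    "query": {
        "bool": {
            "filter": [
                {"terms": {"log.file.path": ["/var/log/messages", "/var/log/secure"]}},
                {"range": {"@timestamp": {"gte": "now-1d/d", "lt": "now/d"}}},
            ]
        }
    },
    "aggs": {"per_host": {"terms": {"field": "host.name", "size": 1000}}},
}

resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=body,
                     auth=("elastic", "changeme"))
for bucket in resp.json()["aggregations"]["per_host"]["buckets"]:
    est_mb = bucket["doc_count"] * AVG_EVENT_BYTES / 1024**2
    print(f"{bucket['key']}: {bucket['doc_count']} docs, ~{est_mb:.1f} MB/day")
```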
I do not have the entire fleet shipping logs yet, just a subset. We are currently negotiating with Elastic and need to produce sizing data for cost estimates.