Has anyone built an end-to-end process, from log forwarding through to log pruning?
Or does everyone just keep x days of logs on the source, in case something happens and you have to re-forward, then implement watcher rules to ensure data is flowing in and call it good?
Scenario:
Filebeat collects logs from a directory and ships them to Logstash/Elasticsearch. Data validation happens in Elasticsearch, and the log is then pruned on the source.
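For what it's worth, here is a minimal sketch of the validate-then-prune step I have in mind. It assumes validation reduces to comparing event counts: `indexed_count` would come from a count query against Elasticsearch (e.g. the `_count` API), and `safe_prune` is a hypothetical helper name, not anything Filebeat or Elastic ships:

```python
import os

def count_lines(path):
    """Count newline-delimited log events in a source file."""
    with open(path, "rb") as f:
        return sum(1 for _ in f)

def safe_prune(path, indexed_count):
    """Delete the source log only if every event is accounted for
    downstream. `indexed_count` would be fetched from Elasticsearch
    (hypothetical integration point, e.g. a _count query filtered
    on the file's path field) before calling this."""
    source_count = count_lines(path)
    if source_count == indexed_count:
        os.remove(path)  # all events confirmed indexed; safe to prune
        return True
    return False  # mismatch: keep the file for re-forwarding
```

The open question is whether people actually close the loop like this, or just rely on retention windows plus watcher alerts.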