I hope you and your loved ones are safe and healthy.
I am running a cluster that collects logs from sources on the internet. I need to enable caching of logs in case the next hop is not reachable, as dropping logs is detrimental to my project.
My architecture is as follows:
Various log sources (mostly running Linux) send logs via Filebeat to my homelab, where they are collected by Logstash.
A. This is where the first unavailability can occur. Since I use a home ISP without a commercial SLA, there are availability issues. How do I enable caching of logs (up to 48 hours) at Filebeat in case the next hop (Logstash hosted in my homelab) is not available?
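From what I understand, Filebeat's disk queue (`queue.disk`) can buffer events locally while Logstash is unreachable, and the Logstash output keeps retrying with backoff until delivery succeeds. Note the disk queue is bounded by size, not time, so "48 hours" has to be translated into a byte budget (roughly peak ingest rate × 48 h, plus headroom). A minimal sketch for `filebeat.yml` — the 20GB figure is an assumption for illustration, not a recommendation:

```yaml
# filebeat.yml — spill events to disk when the output is down (sketch).
# The disk queue is size-bounded, not time-bounded: choose max_size so it
# covers ~48h of your peak log volume (e.g. 100 MB/h peak -> ~5 GB + margin).
queue.disk:
  max_size: 20GB                   # assumed budget; size to your own volume
  path: "${path.data}/diskqueue"   # default data path; needs enough free disk
```

Once Logstash becomes reachable again, queued events are drained in order; events are only dropped if the queue fills, so the `max_size` calculation is the part worth getting right for a 48-hour outage window.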
I have enabled deduplication in Filebeat + Logstash following: Deduplicate data | Filebeat Reference [8.3] | Elastic
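For context, that approach boils down to giving each event a deterministic document ID in Filebeat, so events replayed after an outage overwrite the existing document in Elasticsearch instead of creating duplicates. A sketch using the `fingerprint` processor — the exact fields to hash are an assumption and should match whatever uniquely identifies an event in your logs:

```yaml
# filebeat.yml — derive a stable _id from event content so retransmitted
# events after a cache drain are deduplicated downstream, not double-indexed.
processors:
  - fingerprint:
      fields: ["message", "@timestamp", "log.file.path"]  # assumed key fields
      target_field: "@metadata._id"   # picked up as the Elasticsearch doc _id
```

This pairs well with the caching question above: the larger the buffered backlog, the more likely partial retransmissions occur, and content-based IDs make those replays idempotent.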