Enable Filebeat caching during unavailability

Hello,

I hope you and your loved ones are safe and healthy.

I am running a cluster that collects logs from sources on the internet. I need to enable caching of logs in case the next hop is not reachable, as dropping logs is detrimental to my project.
My architecture is as follows:

Various log sources (mostly running Linux) ship logs via Filebeat to my homelab, where they are collected by Logstash.

A. This is where the first unavailability can occur. As I use a home ISP and do not have a commercial agreement, there are availability issues. How do I enable caching of logs (up to 48 hours) in Filebeat in case the next hop (Logstash, hosted in my homelab) is not available?

I have enabled deduplication in Filebeat + Logstash by following: Deduplicate data | Filebeat Reference [8.3] | Elastic
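For context, the approach in that guide looks roughly like this on the Filebeat side (a sketch only; the input paths and Logstash host are made-up examples, while the add_id processor and @metadata._id field come from the linked doc):

```yaml
# filebeat.yml -- deduplication sketch based on the linked guide.
# The add_id processor attaches a unique ID to each event in @metadata._id,
# which downstream outputs can reuse as the document ID on retries/replays.
filebeat.inputs:
  - type: filestream
    id: example-logs              # example input; adjust to your sources
    paths:
      - /var/log/*.log

processors:
  - add_id: ~

output.logstash:
  hosts: ["logstash.homelab.example:5044"]   # hypothetical homelab endpoint
```

On the Logstash side, the same guide pairs this with `document_id => "%{[@metadata][_id]}"` in the elasticsearch output, so events replayed after an outage overwrite their earlier copies instead of creating duplicates.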

Maybe this?

Internal Queue
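If Logstash being unreachable is the concern, the part of that page to look at is the disk queue, which spools events to disk until the output is available again. A minimal sketch of enabling it in filebeat.yml (the size shown is just the documented default, not a recommendation):

```yaml
# filebeat.yml -- spool events to disk while the Logstash output is unreachable.
# The disk queue is not active unless queue.disk is present in the config.
queue.disk:
  max_size: 10GB   # upper bound on disk space used by the queue
```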


Thank you for this.

I initially thought the 10 GB default would apply out of the box. However, I see that the disk queue needs to be specified explicitly in the configuration. I will add this and let the community know once I simulate an outage. Thank you very much. :slight_smile:
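In case it helps someone else, here is a sketch of what "needs to be specified" means in practice, sized with a made-up ingest rate; measure your own volume before picking a number:

```yaml
# filebeat.yml -- disk queue sized for a multi-hour outage.
# Hypothetical arithmetic: ~100 MB/hour * 48 hours is roughly 4.8 GB of backlog,
# so 10GB leaves headroom; raise max_size if your log volume is higher.
queue.disk:
  max_size: 10GB
  # path: /var/lib/filebeat/diskqueue   # optional; defaults under path.data
```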

@ulisses, thank you very much. I simulated six hours of downtime yesterday and there was no log loss. I will simulate a longer outage today to validate the settings. I've marked it as the solution.

@ulisses - It works. Thank you very much! Amazing :slight_smile:

