Building a Syslog Infrastructure

Hello everyone,

We want to build a scalable syslog infrastructure with the ELK Stack.

We have several branch offices and one head office.

We would only like to collect the syslog data in the branch offices and send it over an existing VPN tunnel to the head office. If the VPN tunnel is down because the internet connection is broken, the syslog instance in the branch office should buffer the data until the connection and the tunnel are back up. The head office should accept the collected syslog data from the branch offices and hand it over to Elasticsearch, and Kibana should finally visualize the data.
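
To make the buffering part concrete: by "buffer" I mean the events should be kept on disk in the branch office while the tunnel is down and sent later. If we end up running Logstash in the branches, I imagine something like its persisted queue (just a sketch, not tested; the path and size are placeholders and this needs a Logstash version that supports persisted queues):

```
# logstash.yml on the branch-office instance (sketch only, values are guesses)
queue.type: persisted          # keep events on disk instead of in memory
queue.max_bytes: 4gb           # how much we are willing to buffer per branch
path.queue: /var/lib/logstash/queue
```

But I am not sure whether that is the right place to buffer, or whether something else in the branch office would be better.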

Can anyone tell me which components (Filebeat, Logstash shipper, Logstash forwarder, ...) I need for that, and in which order?

Best Regards

Daniel

You probably want Filebeat on the hosts that are generating logs; it then pushes into a broker.
In the head office you can then have Logstash pulling data from those brokers and pushing into Elasticsearch.
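
Roughly like this, as a sketch (hostnames, ports, paths, the Redis key and the index name are all made up, and the exact config keys depend on your Filebeat/Logstash versions):

```
# filebeat.yml on a branch host (sketch) - ship local syslog files to a broker
# (Redis here; recent Filebeat versions also have a Kafka output)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog
      - /var/log/messages

output.redis:
  hosts: ["broker.branch1.local:6379"]   # broker runs inside the branch office
  key: "syslog"                          # Redis list the events are pushed onto
```

```
# head-office Logstash pipeline (sketch) - pull from each branch broker, push to ES
input {
  redis {
    host => "broker.branch1.local"   # reached over the VPN tunnel
    data_type => "list"
    key => "syslog"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "syslog-%{+YYYY.MM.dd}"
  }
}
```

Since the broker sits in the branch office, it is also what buffers the events while the VPN is down; Logstash in the head office simply catches up once the tunnel is back.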
