Log collector solution

TL;DR: I need a log collector that ships files with minimal configuration and supports different conditions.

I have 3 servers that each generate a log file daily, about 12 GB in size (12 × 3 = 36 GB per day).

How can I gather these files on a centralized log server?

server1 >
server2 > centralized log
server3 >

FYI1: rsyslog, filebeat, syslog-ng, fluentd, etc. are the available solutions, but I can't decide which one is most suitable for this case.

FYI2: the raw data is important and must not be lost. (I don't want to clean the data; the exact log file matters to me.)

FYI3: it should behave like the Splunk forwarder: whenever a server or the network goes down, it resumes sending data continuously once the issue is resolved. (AFAIK rsyslog uses a tracker file when the service stops and tries to send the remaining file after the service starts again, but while the service is down a new file may be created with a different name and structure that it can't track. The Splunk forwarder handles this situation: even if the forwarder service on a server goes down, whenever it starts again it discovers any file on that path.)

FYI4: here is the path of my logs: /opt/log/*
Different files with different names may be created here, and I need everything on this path to be sent dynamically to the centralized log server.

Any idea?


Have you looked at Elastic Agent, which includes Filebeat, for ease of management?

Elastic Agent's Filebeat can harvest all files matching the path /var/log/*.log, which means Filebeat will harvest every file in the directory /var/log/ that ends with .log.
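As a minimal sketch of what such a harvesting setup could look like in a standalone filebeat.yml (the paths glob is the only value taken from this thread; the input id and the output host are placeholder assumptions):

```yaml
filebeat.inputs:
  - type: filestream
    id: my-log-files          # hypothetical id; filestream inputs need a unique id
    paths:
      - /var/log/*.log        # glob from the reply; new matching files are discovered automatically

output.elasticsearch:
  hosts: ["https://localhost:9200"]   # hypothetical destination; replace with your cluster
```

Filebeat keeps its read offsets in a registry on disk, so after an outage it resumes from where it left off rather than re-sending or dropping data, which speaks to the FYI3 requirement above.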

OK, but it sends files line by line to the server and stores them in an index. The raw file, with its exact name, is missing on the server.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.