I was asked to design an architecture that pulls log files from a Unix server, displays errors on a dashboard, and raises alerts.
After some research I found the Elastic Stack.
My plan is to use Filebeat to collect the files on the Unix server and ship them to Logstash running on another server, parse them there with Logstash filters, then index the results into Elasticsearch and visualize them with Kibana.
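To make the setup concrete, here is a rough sketch of the Filebeat config I have in mind. The paths and the Logstash host are placeholders I made up, and I believe newer Filebeat versions use the `filestream` input type where older ones used `log`:

```yaml
# Sketch only: paths and hosts below are assumptions, not real values.
filebeat.inputs:
  - type: filestream            # tails files and ships new lines as they appear
    id: app-errors              # unique id required by the filestream input
    paths:
      - /var/log/myapp/*.log    # hypothetical location of the error logs

output.logstash:
  hosts: ["logstash.example.com:5044"]   # hypothetical Logstash server
```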
One question came up: I assume Filebeat has to be deployed on the server where the logs live, which is the same server that runs all the other important workloads.
With that in mind, and given that the logs can grow to nearly 1 GB and the file is continuously updated with new error information: how does Filebeat ship data to Logstash? Does it send the whole file every time, or only the new lines appended since the last read?
Sorry for writing so much. I hope you can help.