We have two separate ELK servers that we use to analyse logs from our own application: one in-house and the other on the customer's site. We have different Logstash pipelines for the application itself and for the various application services, e.g. an application.log pipeline, then application.service1.log, application.service2.log and so on.
The in-house setup gathers the logs from a share after a script moves them there; by that point the files are static, i.e. no longer being changed or appended to. Filebeat is configured to push the logs into Logstash.
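For illustration, the in-house Filebeat setup looks roughly like this (paths, field names and the Logstash host are simplified placeholders, not our exact settings):

```yaml
# Simplified sketch of the in-house filebeat.yml (placeholders, not real paths).
# The files on the share are static copies; nothing writes to them after the
# script has moved them there.
filebeat.inputs:
  - type: log
    paths:
      - /mnt/logshare/application.log
    fields:
      log_type: application.log            # used on the Logstash side for routing
    fields_under_root: true
  - type: log
    paths:
      - /mnt/logshare/application.service1.log
    fields:
      log_type: application.service1.log
    fields_under_root: true

output.logstash:
  hosts: ["localhost:5044"]
```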
On the customer site we have multiple application servers running Filebeat, pushing the logs over the network directly to Logstash on the ELK server. Some of the logs are live, i.e. constantly being appended to over time.
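The customer-site configuration has essentially the same shape, except that Filebeat runs on each application server and the files are still being written to while it reads them (again, hostnames and paths below are placeholders):

```yaml
# Simplified sketch of filebeat.yml on each customer application server.
# Unlike the in-house share, these files are live: the application keeps
# appending to them while Filebeat is harvesting.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/application/application.log
    fields:
      log_type: application.log
    fields_under_root: true
  - type: log
    paths:
      - /var/log/application/application.service1.log
    fields:
      log_type: application.service1.log
    fields_under_root: true

output.logstash:
  hosts: ["elk.customer.example:5044"]
```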
We have noticed that the logs are indexed differently on the customer site. There appears to be some "cross-contamination" between the application.log and application.service1.log logs: we see log lines in the message field that shouldn't be there.
Our question is: should there be any difference between sending static logs versus a live feed to Logstash, and if so, which Filebeat input options (or Elasticsearch output options?) should be applied in this live-feed scenario?
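For context, these are the kinds of log input options from the Filebeat documentation that look relevant to reading live files; the values shown are the documented defaults rather than our settings, and we are unsure which, if any, we should change:

```yaml
# Filebeat log input options that seem related to harvesting live files
# (values shown are the documented defaults, not our current settings).
filebeat.inputs:
  - type: log
    paths:
      - /var/log/application/application.log   # placeholder path
    scan_frequency: 10s     # how often Filebeat checks the paths for new files
    close_inactive: 5m      # close the harvester after the file has been idle this long
    close_eof: false        # if true, close the file as soon as EOF is reached
    # If the application writes multi-line entries (e.g. stack traces), a
    # multiline config keeps them in one event instead of splitting them;
    # the pattern below is only an example for lines starting with a date:
    # multiline.pattern: '^\d{4}-\d{2}-\d{2}'
    # multiline.negate: true
    # multiline.match: after
```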
Apologies if some of the terminology is off; it's because I am new to the ELK stack.
Thanks in advance