Hi,
I have a persistent volume where logs from multiple Docker container processes are written. Those logs potentially come from multiple applications (related to each other with respect to their domain context).
Filebeat's multiple-prospectors configuration nicely lets me collect logs from these different files.
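For reference, this is roughly what my prospectors section looks like (the paths and app names here are just placeholders):

```yaml
filebeat.prospectors:
  # App A writes its logs to one directory on the shared volume
  - input_type: log
    paths:
      - /data/logs/app-a/*.log
  # App B writes to another directory, with a different log structure
  - input_type: log
    paths:
      - /data/logs/app-b/*.log
```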
However, since these logs come from different apps, their structure is potentially different, and they need to go to different indices on the target Elasticsearch cluster. I was hoping for a way to configure multiple Elasticsearch outputs (similar to multiple prospectors), each with its own Filebeat settings, so that each output loads its own template and ships its logs to its own index.
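Right now I only know how to configure a single output, something like this (Filebeat 5.x style syntax, names are placeholders), and I would effectively need one such block per app:

```yaml
output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  # As far as I can tell, only one index and one template can be set here
  index: "app-a-%{+yyyy.MM.dd}"
  template.name: "app-a"
  template.path: "app-a.template.json"
```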
OR
I can differentiate the logs by type in the prospectors configuration and ship them all to the same index. But since their log structure differs, that would require the ability to configure multiple templates.
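In that case I would tag each prospector with a type, roughly like this (again, paths and type names are placeholders):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /data/logs/app-a/*.log
    document_type: app-a   # lets me tell the two apps apart downstream
  - input_type: log
    paths:
      - /data/logs/app-b/*.log
    document_type: app-b
```

But then the single template configured on the output would have to cover both log structures, which is what I am trying to avoid.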
Is either of these possible? Or what is the recommended way to handle this situation?
Thanks!