Logstash on k8s

Hi everyone,

This is more of a design question:

What would be the best way to run Logstash when it needs to monitor several different files (different logs) in a Kubernetes environment?

Say we want to monitor 6 different files.

One pod: a single Logstash instance with multiple pipelines and tags to handle each file separately.
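For the single-instance option, a minimal sketch of what the multi-pipeline setup might look like in Logstash's `pipelines.yml` (the pipeline IDs and config paths here are made-up examples, not anything from a real deployment):

```yaml
# pipelines.yml — one Logstash instance, one pipeline per log file.
# Each pipeline gets its own config file with its own file input,
# filters, and output, so the files are processed independently.
- pipeline.id: app-log            # hypothetical ID
  path.config: "/usr/share/logstash/pipeline/app.conf"
- pipeline.id: access-log         # hypothetical ID
  path.config: "/usr/share/logstash/pipeline/access.conf"
# ...one entry per file, six in total for this scenario
```

With this layout, a pipeline that crashes is restarted by Logstash itself without taking the other five down, though they all still share one JVM.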

Or a separate pod for each file, each running its own Logstash instance that handles a single file.

The latter seems more elegant (if a pod dies, Kubernetes brings it back up), but the cluster would have to tolerate several Logstash instances (and their JVMs).
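For comparison, the per-file option would mean one Deployment per log file, something like the sketch below (the names, labels, and image tag are illustrative assumptions, and you'd stamp out six copies of this, one per file):

```yaml
# Sketch: one Deployment per log file, each pod running Logstash
# with a single pipeline mounted from its own ConfigMap.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: logstash-access-log          # hypothetical name, one per file
spec:
  replicas: 1
  selector:
    matchLabels:
      app: logstash-access-log
  template:
    metadata:
      labels:
        app: logstash-access-log
    spec:
      containers:
      - name: logstash
        image: docker.elastic.co/logstash/logstash:8.13.0  # example tag
        volumeMounts:
        - name: pipeline
          mountPath: /usr/share/logstash/pipeline
      volumes:
      - name: pipeline
        configMap:
          name: logstash-access-log-pipeline  # hypothetical ConfigMap
```

The trade-off is visible right in the manifest: you get per-file restart and resource isolation from Kubernetes, at the cost of six JVM heaps and six sets of manifests to maintain.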

I know this is kind of open-ended, but I'm interested in other people's thoughts on the matter.

Thank you.
