Hi! Currently I'm running a self-hosted Elastic Stack, but I'm evaluating Elastic Cloud.
I collect all the logs from multiple k8s clusters in a standard way with the filebeat kubernetes module. There are application and nginx logs. For now filebeat's output is logstash, where I match logs with grok, tag them, and send them to different indices, e.g. `filebeat-nginx-*`, `filebeat-app-*`, `filebeat-other-*`. I'd like to drop logstash and not have to worry about managing it, if there is a way to filter the logs like this with filebeat itself or with ingest node pipelines.
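For the ingest node option, something along these lines is what I'm imagining; a rough sketch, where the host is a placeholder and `parse-nginx` / `parse-app` are hypothetical pipeline names that would have to be created in Elasticsearch first:

```yaml
# filebeat.yml -- route events to different ingest pipelines per condition.
# Assumes kubernetes metadata (kubernetes.container.name etc.) is attached
# to each event; the pipeline names below are made up.
output.elasticsearch:
  hosts: ["https://my-deployment.es.io:9243"]  # placeholder host
  pipelines:
    - pipeline: "parse-nginx"
      when.contains:
        kubernetes.container.name: "nginx"
    - pipeline: "parse-app"
      when.contains:
        kubernetes.labels.app: "myapp"  # made-up label
  # events matching no condition are indexed without a pipeline
```

The grok-style parsing that logstash does today would then live inside each pipeline.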
I also thought about using `indices` in the filebeat output config along with some processors, but it's hard to do proper filtering on the content of a message with `when.contains`. There's also `when.regexp`, which gets closer.
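To make it concrete, this is roughly what I tried (index names match my current ones; the host and the match conditions are simplified examples):

```yaml
# filebeat.yml -- conditional index routing instead of logstash.
output.elasticsearch:
  hosts: ["https://my-deployment.es.io:9243"]  # placeholder host
  indices:
    - index: "filebeat-nginx-%{+yyyy.MM.dd}"
      when.contains:
        kubernetes.container.name: "nginx"
    - index: "filebeat-app-%{+yyyy.MM.dd}"
      when.regexp:
        message: '^\['  # example pattern only
  # fallback for everything that matches no condition above
  index: "filebeat-other-%{+yyyy.MM.dd}"
  # note: overriding the index also requires setup.template.name/pattern
```

The routing itself works, but expressing real message-content matching this way feels clunky compared to grok.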
So I'd like to ask: has anyone had the same problem and solved it in an elegant way? Maybe my approach to indexing should be completely different? How do you handle logs from multiple k8s clusters, where every log goes to a common place (stdout, stderr)?