Kubernetes - Multiple clusters on same Elastic installation

Hello guys!

We are currently using Elastic to monitor our stack.

We have multiple GKE (Google Kubernetes Engine) clusters, and I would like to have their logs sent to the same Elastic instance.

Is there a recommended way to create something like a "namespace" per cluster?

Our goal is to be able to query by "cluster".

Thanks!

Hi @dekim,

that's not an uncommon scenario. What has worked for me so far is to encode the cluster identifier in the index names (such as logs-cluster-${cluster_id}) and to add a field at ingest time that contains the cluster identifier. If you're using Filebeat or Elastic Agent, you could achieve the latter using the add_kubernetes_metadata processor, which can add the labels of the monitored resource to each document.
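
To make that concrete, here is a minimal filebeat.yml sketch that does both: it routes each cluster's logs into its own index and stamps every event with a cluster name. The cluster name gke-prod-europe-west1, the orchestrator.cluster.name field, the Elasticsearch host, and the NODE_NAME variable are placeholders I'm assuming for illustration, not settings from this thread:

```yaml
# Minimal sketch of a per-cluster filebeat.yml (all values are placeholders).
filebeat.inputs:
  - type: container
    paths:
      - /var/log/containers/*.log

processors:
  # Enrich each event with pod, namespace, and label metadata from the Kubernetes API.
  - add_kubernetes_metadata:
      host: ${NODE_NAME}   # typically injected via the DaemonSet's spec.nodeName fieldRef
      matchers:
        - logs_path:
            logs_path: "/var/log/containers/"
  # Stamp every event with an explicit cluster identifier.
  - add_fields:
      target: orchestrator.cluster
      fields:
        name: gke-prod-europe-west1   # use a different value in each cluster's config

output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:9200"]
  # Route this cluster's logs into its own index.
  index: "logs-cluster-gke-prod-europe-west1-%{+yyyy.MM.dd}"

# Required when overriding the default index name.
setup.template.name: "logs-cluster"
setup.template.pattern: "logs-cluster-*"
setup.ilm.enabled: false   # ILM would otherwise override the custom index setting
```

Running the same config in every cluster with only the cluster name changed keeps the setup uniform.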

This setup then gives you the flexibility to limit the queried indices to a single cluster, or to query all of logs-* and filter or aggregate on the cluster identifier.
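
For example, assuming the cluster identifier ends up in a keyword-mapped field such as the hypothetical orchestrator.cluster.name above, a terms aggregation in Kibana Dev Tools would break the log volume down per cluster:

```
# Count log documents per cluster across all log indices.
GET logs-*/_search
{
  "size": 0,
  "aggs": {
    "logs_per_cluster": {
      "terms": { "field": "orchestrator.cluster.name" }
    }
  }
}
```

A plain KQL filter like orchestrator.cluster.name : "gke-prod-europe-west1" in the Kibana search bar works the same way for narrowing a dashboard to one cluster.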

Awesome!

I will try this out 🙂