How to send the logs of a specific microservice to a different index

Hi,

I'm working with GKE and Elastic Cloud. I have configured Filebeat in my k8s cluster and it is working fine, but it is sending all the logs to one index, and I would like some specific services to send their logs to another index. Is that possible?

I didn't find any annotation in Filebeat to set this up, and I don't know whether it's possible or not.

This is my current ConfigMap:

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: logging
  labels:
    k8s-app: filebeat
data:
  filebeat.yml: |-
    logging.level: warning
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          node: ${NODE_NAME}
          hints.enabled: true
          hints.default_config:
            type: container
            paths:
              - /var/log/containers/*${data.kubernetes.container.id}.log
            exclude_lines: ['^[[:space:]]*$']
            multiline.pattern: '^[[:space:]]+(at|\.{3})\b|^Caused by:'
            multiline.negate: false
            multiline.match: after
    processors:
      - add_cloud_metadata:
      - add_host_metadata:
      - decode_json_fields:
          fields: ["message"]
          target: "company"
          overwrite_keys: true
      - add_fields:
          target: ''
          fields:
            gkeclustername: company-apps-stage
            environment: stage

    #setup.dashboards.beat: filebeat
    #setup.dashboards.enabled: true

    cloud.id: ${ELASTIC_CLOUD_ID}
    cloud.auth: ${ELASTIC_CLOUD_AUTH}

    setup.ilm.rollover_alias: "${INDEX_NAME_CLUSTER}"

    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
      username: ${ELASTICSEARCH_USERNAME}
      password: ${ELASTICSEARCH_PASSWORD}
---

In the DaemonSet block I can see the env vars that control where the logs are sent, but they apply to all logs...

kind: DaemonSet
....
....
env:
        - name: ELASTICSEARCH_HOST
          value: xxxxxxwest1.gcp.cloud.es.io
        - name: ELASTICSEARCH_PORT
          value: "9243"
        - name: ELASTICSEARCH_USERNAME
          value: xxxx
        - name: ELASTICSEARCH_PASSWORD
          value: xxxxx
        - name: ELASTIC_CLOUD_ID
          value: xxxxxxx
        - name: ELASTIC_CLOUD_AUTH
          value: xxxx:xxxx
        - name: INDEX_NAME_CLUSTER
          value: gke_stage
        - name: NODE_NAME
          valueFrom:
            fieldRef:
              fieldPath: spec.nodeName

Can you help me with any clue?

Thank you very much

That would normally be handled in the output.elasticsearch section, but you're using ILM which, unfortunately, complicates things as Filebeat cannot handle variables in the ILM output section.

I don't know of another solution here off the top of my head, hopefully someone else can chime in.
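For reference, without ILM this kind of routing can be done with the `indices` setting of the Elasticsearch output. The following is only a sketch, assuming `setup.ilm.enabled: false` (Filebeat ignores custom `index`/`indices` settings while ILM is enabled) and assuming the target pods carry a hypothetical `app: myservice` label; conditions on `kubernetes.labels` work because the autodiscover provider adds pod metadata to each event:

```yaml
# Sketch only: ILM must be disabled for custom index names to take effect,
# and a template name/pattern must be set when overriding the index.
setup.ilm.enabled: false
setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"

output.elasticsearch:
  hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
  username: ${ELASTICSEARCH_USERNAME}
  password: ${ELASTICSEARCH_PASSWORD}
  # Fallback index for events that match no condition below.
  index: "filebeat-default-%{+yyyy.MM.dd}"
  indices:
    # "myservice" is a placeholder label value; adjust to your pods.
    - index: "filebeat-myservice-%{+yyyy.MM.dd}"
      when.equals:
        kubernetes.labels.app: "myservice"
```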


Hi @David_Oceans, if I understood correctly you would like to be able to send log lines to different indexes based on their content? So you have one single log "stream" where multiple services send logs and you want to be able to specify destination indexes based on the log message content or attributes.

When you say "service", are you referring to specific pods? i.e. you have an nginx pod and a mysql pod and would like their logs to be sent to different indexes.

Have I understood this right?
Thanks


Yes, by "services" I mean different pods, exactly what you said.

Do you know the best way? Can I use a simple annotation on those pods/deployments to send the logs of that service to a specific index?

Thank you very much

Hello @David_Oceans , sorry for the delay on this, it slipped through my notifications.

We had a discussion about this internally, and there should be a way to tackle this.

First I suggest you to have a look at this other discussion, as it seems related: Send k8s logs with Filebeat to separate indices - #2 by jbury

Otherwise, this is possible using Logstash between Filebeat and Elasticsearch.
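For anyone landing here later, the Logstash route would look roughly like this. A minimal sketch, assuming Filebeat's output is switched from `output.elasticsearch` to `output.logstash` (pointing at Logstash on port 5044), and again using a hypothetical `app: myservice` pod label and placeholder index names:

```conf
# Minimal Logstash pipeline sketch (label and index names are placeholders).
input {
  beats {
    port => 5044
  }
}

output {
  # Route events from the labelled pods to their own index.
  if [kubernetes][labels][app] == "myservice" {
    elasticsearch {
      hosts => ["https://myservice-cluster.example.es.io:9243"]  # placeholder host
      index => "myservice-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["https://myservice-cluster.example.es.io:9243"]  # placeholder host
      index => "filebeat-default-%{+YYYY.MM.dd}"
    }
  }
}
```

Since Logstash owns the index names here, index-per-service routing no longer depends on what the Filebeat Elasticsearch output (or its ILM setup) supports.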

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.