Hi,
I'm working with GKE and Elastic Cloud. I have configured Filebeat in my k8s cluster and it's working fine, but it's sending all the logs to a single index, and I would like some specific services to send their logs to a different index. Is that possible?
I couldn't find any Filebeat annotation to set this up, and I don't know whether it's possible or not.
This is my current ConfigMap:
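The kind of thing I have in mind is conditional index routing in the output section, something like the sketch below. It's only a sketch: the label app: payments, the index name, and the INDEX_NAME_PAYMENTS env var are made-up examples, and I'm not sure how this interacts with ILM and setup.ilm.rollover_alias.

    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
      username: ${ELASTICSEARCH_USERNAME}
      password: ${ELASTICSEARCH_PASSWORD}
      indices:
        # made-up example: send pods labelled app=payments to their own index;
        # anything that doesn't match should keep using the default index.
        # INDEX_NAME_PAYMENTS would be a new env var on the DaemonSet (shown further down).
        - index: "${INDEX_NAME_PAYMENTS}"
          when.equals:
            kubernetes.labels.app: "payments"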
apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: logging
  labels:
    k8s-app: filebeat
data:
  filebeat.yml: |-
    logging.level: warning
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          node: ${NODE_NAME}
          hints.enabled: true
          hints.default_config:
            type: container
            paths:
              - /var/log/containers/*${data.kubernetes.container.id}.log
            exclude_lines: '^[[:space:]]*$'
            multiline.pattern: '^[[:space:]]+(at|\.{3})\b|^Caused by:'
            multiline.negate: false
            multiline.match: after
    processors:
      - add_cloud_metadata:
      - add_host_metadata:
      - decode_json_fields:
          fields: ["message"]
          target: "company"
          overwrite_keys: true
      - add_fields:
          target: ''
          fields:
            gkeclustername: company-apps-stage
            environment: stage
    #setup.dashboards.beat: filebeat
    #setup.dashboards.enabled: true
    cloud.id: ${ELASTIC_CLOUD_ID}
    cloud.auth: ${ELASTIC_CLOUD_AUTH}
    setup.ilm.rollover_alias: "${INDEX_NAME_CLUSTER}"
    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
      username: ${ELASTICSEARCH_USERNAME}
      password: ${ELASTICSEARCH_PASSWORD}
---
In the DaemonSet block I can see the env var that defines the index the logs are sent to, but it applies to all logs:
kind: DaemonSet
....
....
        env:
        - name: ELASTICSEARCH_HOST
          value: xxxxxxwest1.gcp.cloud.es.io
        - name: ELASTICSEARCH_PORT
          value: "9243"
        - name: ELASTICSEARCH_USERNAME
          value: xxxx
        - name: ELASTICSEARCH_PASSWORD
          value: xxxxx
        - name: ELASTIC_CLOUD_ID
          value: xxxxxxx
        - name: ELASTIC_CLOUD_AUTH
          value: xxxx:xxxx
        - name: INDEX_NAME_CLUSTER
          value: gke_stage
        - name: NODE_NAME
          valueFrom:
            fieldRef:
              fieldPath: spec.nodeName
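If the conditional output sketched above is the right direction, I guess I would also add one env var per extra index here, something like this (INDEX_NAME_PAYMENTS and its value are names I just made up for the example):

        - name: INDEX_NAME_PAYMENTS
          value: gke_stage_payments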
Can you give me any clue?
Thank you very much