I have multiple Kubernetes pods that I want to monitor with a Filebeat DaemonSet. I want to annotate some pods with "json_logs: true" so that Filebeat applies the "decode_json_fields" processor to the "message" field before shipping the events to Elasticsearch. I am deploying the DaemonSet with the Helm chart, and it comes up successfully; however, no logs are being pushed to Elasticsearch. I have tried every combination I know of and nothing works. Any help is appreciated. My Filebeat values file is included below.
filebeat.yml: |
  filebeat.autodiscover:
    providers:
      - type: kubernetes
        in_cluster: true
        templates:
          - condition:
              contains:
                kubernetes.annotations.json_logs: "true"
            config:
              - type: container
                tail_files: true
                containers.ids:
                  - "${data.kubernetes.container.id}"
                paths:
                  - /var/log/containers/*${data.kubernetes.container.id}*.log
                processors:
                  - add_kubernetes_metadata:
                      in_cluster: true
                  - decode_json_fields:
                      fields: ["message"]
                      process_array: true
                      max_depth: 10
                      overwrite_keys: false
                      add_error_key: true
  output.elasticsearch:
    username: '${ELASTICSEARCH_USERNAME}'
    password: '${ELASTICSEARCH_PASSWORD}'
    protocol: https
    hosts: ["HOSTNAME"]
extraEnvs:
  - name: 'ELASTICSEARCH_USERNAME'
    valueFrom:
      secretKeyRef:
        name: elastic-credentials
        key: username
  - name: 'ELASTICSEARCH_PASSWORD'
    valueFrom:
      secretKeyRef:
        name: elastic-credentials
        key: password
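For reference, this is the pod side of the setup: a minimal sketch of a pod manifest that the autodiscover condition above should match. The pod name and image are hypothetical; only the "json_logs" annotation matters for the template condition.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: my-json-app            # hypothetical name
  annotations:
    # Matched by the contains condition on kubernetes.annotations.json_logs
    json_logs: "true"
spec:
  containers:
    - name: app
      image: my-json-app:latest   # hypothetical image; writes JSON lines to stdout
```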