I am trying to run a Filebeat DaemonSet in a Kubernetes cluster. It is expected to honor multiline log entries and also parse JSON log entries.
This is the config snippet:
filebeat.autodiscover:
  providers:
    - type: kubernetes
      node: ${NODE_NAME}
      hints.enabled: true
      hints.default_config:
        type: container
        paths:
          - /var/log/containers/*${data.kubernetes.container.id}.log
        multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
        multiline.negate: true
        multiline.match: after
        multiline.max_lines: 50
        multiline.timeout: 3s

processors:
  - add_cloud_metadata:
  - drop_event:
      when:
        or:
          - equals:
              kubernetes.namespace: "kube-system"
          - equals:
              kubernetes.namespace: "metallb-system"
  - decode_json_fields:
      fields: ["message"]
      process_array: false
      max_depth: 1
      target: ""
      overwrite_keys: true
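For reference, here is a minimal Python sketch of what the `decode_json_fields` processor in the config is meant to do per event: parse the `message` field as JSON and, with `target: ""` and `overwrite_keys: true`, merge the decoded keys into the event root. The `event` dict and its values are hypothetical illustration data, not actual Filebeat output.

```python
import json

# Hypothetical event: "message" holds a JSON log line, and the event
# already has a "level" field that decode_json_fields may overwrite.
event = {"message": '{"level": "error", "msg": "boom"}', "level": "info"}

try:
    decoded = json.loads(event["message"])
except ValueError:
    decoded = None  # non-JSON messages are left untouched

if isinstance(decoded, dict):
    # target: "" puts decoded keys at the event root;
    # overwrite_keys: true lets them replace existing fields.
    event.update(decoded)

print(event["level"])  # → "error" (overwritten by the decoded JSON)
```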
Everything works as desired except for one issue: some JSON logs are being merged together, which is not expected.
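To illustrate why this can happen, here is a rough Python sketch of the multiline aggregation implied by the settings above (`negate: true`, `match: after`): any line that does not start with a date is appended to the previous event, and a JSON line beginning with `{` never matches the date pattern. The sample lines are made up for illustration.

```python
import re

# Same pattern as multiline.pattern in the config above.
pattern = re.compile(r'^[0-9]{4}-[0-9]{2}-[0-9]{2}')

lines = [
    '2023-01-01 12:00:00 plain text start',
    '  continuation of a stack trace',        # merged (desired)
    '{"level":"info","msg":"json line 1"}',   # merged (NOT desired)
    '{"level":"info","msg":"json line 2"}',   # merged (NOT desired)
]

events = []
for line in lines:
    # With negate: true and match: after, a line matching the pattern
    # starts a new event; everything else is appended to the previous one.
    if pattern.match(line) or not events:
        events.append(line)
    else:
        events[-1] += '\n' + line

print(len(events))  # → 1: every JSON line was folded into the first event
```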
Does anybody know how to deal with it?