Hi!
I have a question about parsing JSON log messages produced by Kubernetes deployments. I've already seen this thread, this page in the docs, and this page, but none of them seems relevant. The problem is that some apps in our cluster log in JSON and some don't, so I need a way to tell Filebeat to attempt JSON parsing only for a certain set of apps.
I tried adding the following config to filebeat.yml, but it causes parsing errors for log messages that aren't JSON-formatted:
- decode_json_fields:
    fields: ["message"]
    process_array: true
    max_depth: 10
    overwrite_keys: false
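One workaround I've been considering is guarding the processor with a condition so it only fires when the message at least looks like JSON. This is just a sketch, assuming a `when` clause with a `regexp` condition can be attached to `decode_json_fields` and that a leading `{` is a good-enough heuristic:

```yaml
processors:
  - decode_json_fields:
      # Only attempt decoding when the message starts with "{"
      # (assumption: this is a reliable marker for our JSON-logging apps)
      when:
        regexp:
          message: '^\s*\{'
      fields: ["message"]
      process_array: true
      max_depth: 10
      overwrite_keys: false
```

But I'd rather scope this by app than by message shape, since a non-JSON app could still emit a line starting with `{`.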
Here are the ConfigMaps for our filebeat setup:
apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-system
  labels:
    k8s-app: filebeat
data:
  filebeat.yml: |-
    filebeat.config:
      inputs:
        # Mounted `filebeat-inputs` configmap:
        path: /usr/share/filebeat/inputs.d/*.yml
        # Reload inputs configs as they change:
        reload.enabled: false
      modules:
        path: /usr/share/filebeat/modules.d/*.yml
        # Reload module configs as they change:
        reload.enabled: false
    output.elasticsearch:
      hosts: ['...:9200']
      username: ...
      password: ...
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-inputs
  namespace: kube-system
  labels:
    k8s-app: filebeat
data:
  kubernetes.yml: |-
    - type: docker
      containers.ids:
        - "*"
      processors:
        - add_kubernetes_metadata:
            in_cluster: true
Is there a way to specify different processors depending on the source container/app? That would solve the problem, since I could then add the JSON processor only for the apps that need it.
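Ideally I'm hoping for something like the following, assuming the processor's `when` condition can match on the Kubernetes metadata that `add_kubernetes_metadata` attaches (`kubernetes.container.name` and the app name `my-json-app` are my guesses, not something I've verified):

```yaml
processors:
  - add_kubernetes_metadata:
      in_cluster: true
  - decode_json_fields:
      # Hypothetical: only decode for a specific container,
      # matched via the metadata added by add_kubernetes_metadata
      when:
        equals:
          kubernetes.container.name: "my-json-app"
      fields: ["message"]
      process_array: true
      max_depth: 10
      overwrite_keys: false
```

Does something along these lines exist, or is there a better-supported mechanism for per-app processing?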