Filebeat - Parse json output

I realize this has been asked numerous times, but I'm having a hard time getting Filebeat 6.2.4 to send JSON logs to Elasticsearch (without Logstash) in a Kubernetes environment. For example, given a JSON log like this:


The entry in Kibana looks like this:

The JSON output is being logged under the log field. I need the fields to be parsed and placed in the root of the document, and my understanding was that json.keys_under_root: true would do the trick, but it doesn't seem to. Below is my full config:

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-system
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  filebeat.yml: |-
    filebeat.config:
      prospectors:
        path: ${path.config}/prospectors.d/*.yml
        reload.enabled: false
      modules:
        path: ${path.config}/modules.d/*.yml
        reload.enabled: false
    logging.level: debug
    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:443}']
      template.name: filebeat
      template.path: filebeat.template.json

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-prospectors
  namespace: kube-system
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  kubernetes.yml: |-
    - type: log
      paths:
        - /var/lib/docker/containers/*/*.log
      json.add_error_key: true
      json.keys_under_root: true
      json.message_key: log
      processors:
        - add_kubernetes_metadata:
            in_cluster: true
            namespace: ${POD_NAMESPACE}

Any help in the right direction would be appreciated.

I see you have configured your Docker container log files as the path, but with the log input type. Docker itself wraps each log line in JSON, which gives you JSON in JSON.

Have you tried the docker input type instead of log?
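To illustrate the JSON-in-JSON point, here is a rough sketch (in Python, with a made-up log event — not the original poster's data) of what a line in Docker's json-file logs looks like and why a single decode is not enough:

```python
import json

# Hypothetical application event, and the envelope Docker's json-file
# driver writes around it in /var/lib/docker/containers/<id>/*-json.log
app_event = {"level": "info", "msg": "started"}
docker_line = json.dumps({
    "log": json.dumps(app_event) + "\n",  # the app's JSON, stored as a string
    "stream": "stdout",
    "time": "2018-05-01T12:00:00Z",
})

# Decoding only the outer layer (what the json.* settings see) leaves the
# application's JSON as a plain string under the `log` field:
outer = json.loads(docker_line)
inner = json.loads(outer["log"])  # a second decode is still required
print(inner["msg"])
```

This is why the fields end up nested under log in Kibana: the outer envelope parses fine, but the inner JSON is just a string until something decodes it a second time.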

It turns out my log file actually had multi-line entries in it. Adding

multiline.pattern: '^{'
multiline.negate: true
multiline.match: after

as well as

- decode_json_fields:
    fields: ["log"]
    process_array: false
    max_depth: 5
    target: ""
    overwrite_keys: true

to the list of processors fixed the issue.
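The combined effect of the two settings can be sketched as follows (a simplified Python illustration with made-up log lines, not Filebeat's actual implementation): lines that do not match `^{` are treated as continuations of the previous line, and the reassembled string is then JSON-decoded with its keys lifted to the event root, which is what decode_json_fields with target: "" and overwrite_keys: true does.

```python
import json
import re

# Hypothetical raw lines: one pretty-printed (multi-line) JSON event
# followed by a single-line one.
raw_lines = [
    '{"level": "info",',
    ' "msg": "started"}',
    '{"level": "warn", "msg": "slow"}',
]

# multiline.pattern: '^{' with negate: true, match: after means a line
# NOT starting with '{' is appended to the event that came before it.
pattern = re.compile(r"^{")
events, current = [], None
for line in raw_lines:
    if pattern.match(line):      # a new event begins
        if current is not None:
            events.append(current)
        current = line
    else:                        # continuation line: glue onto previous event
        current += line
if current is not None:
    events.append(current)

# decode_json_fields then parses each reassembled string and, with
# target: "" and overwrite_keys: true, places the keys at the event root.
decoded = [json.loads(e) for e in events]
print(len(decoded), decoded[0]["msg"])
```

With the multiline settings in place, the two physical lines of the first event reach the JSON decoder as one string, so the decode no longer fails on a partial object.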
