Filebeat - Parse JSON output

I realize this has been asked numerous times, but I'm having a hard time getting Filebeat 6.2.4 to send JSON logs to Elasticsearch (without Logstash) in a Kubernetes environment. For example, given this JSON log entry:

{"@timestamp":"2019-04-23T16:22:33.979906989Z","application":"my-app","correlation_id":"55189165-dafa-4b22-a720-adfa9eafc6e6","data_version":2,"description":"","envoy_data":{"downstream_local":{"address":"127.0.0.1","port":12345,"protocol":"TCP"},"downstream_remote":{"address":"127.0.0.1","port":56004,"protocol":"TCP"},"listener":"ingress","request_received_time":"2019-04-04T16:22:33.979Z","request_start_time":"2019-04-04T16:22:33.979Z","response_duration_ms":0,"response_received_time":"2019-04-04T16:22:33.979Z","response_start_time":"2019-04-04T16:22:33.979Z","upstream_cluster":"my-service","upstream_remote":{"address":"127.0.0.1","port":8080,"protocol":"TCP"}},"level":"info","msg":"","roletype":"my-role-type","service":"foobar","type":"log"}

The entry in Kibana looks like this:

The JSON output ends up as a string under the "log" field. I need the fields parsed and placed at the root of the document, and my understanding was that json.keys_under_root: true would do the trick, but it doesn't seem to. Below is my full config:

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-system
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  filebeat.yml: |-
    filebeat.config:
      prospectors:
        path: ${path.config}/prospectors.d/*.yml
        reload.enabled: false
      modules:
        path: ${path.config}/modules.d/*.yml
        reload.enabled: false
    logging.level: debug
    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:443}']
      template.name: filebeat
      template.path: filebeat.template.json

---
apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-prospectors
  namespace: kube-system
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  kubernetes.yml: |-
    - type: log
      paths:
        - /var/lib/docker/containers/*/*.log
      json.add_error_key: true
      json.keys_under_root: true
      json.message_key: log
      processors:
        - add_kubernetes_metadata:
            in_cluster: true
            namespace: ${POD_NAMESPACE}

Any help in the right direction would be appreciated.

I see you have configured the path to your Docker container logs, but with the log input type. Docker itself wraps each log line in JSON, which gives you JSON in JSON.

Have you tried the docker input type instead of log?
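As a rough sketch, the prospector could look something like this with the docker input (available since Filebeat 6.0); the wildcard under containers.ids is an assumption here and can be narrowed to specific container IDs:

```yaml
kubernetes.yml: |-
  - type: docker
    containers.ids:
      - "*"   # read logs from all containers; narrow if needed
    processors:
      - add_kubernetes_metadata:
          in_cluster: true
          namespace: ${POD_NAMESPACE}
```

The docker input strips the outer Docker JSON envelope for you, so only your application's own JSON is left to deal with.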

It actually turns out my log file had multi-line entries in it. Adding

multiline.pattern: '^{'
multiline.negate: true
multiline.match: after

as well as

- decode_json_fields:
    fields: ["log"]
    process_array: false
    max_depth: 5
    target: ""
    overwrite_keys: true

to the prospector config (the multiline settings at the input level, and decode_json_fields to the list of processors) fixed the issue.
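For anyone landing here later, a sketch of what the full kubernetes.yml prospector looks like with those fixes folded into the original config (this simply combines the snippets above; I haven't changed any other options):

```yaml
- type: log
  paths:
    - /var/lib/docker/containers/*/*.log
  json.add_error_key: true
  json.keys_under_root: true
  json.message_key: log
  # join continuation lines (anything not starting with '{') onto the previous event
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after
  processors:
    - add_kubernetes_metadata:
        in_cluster: true
        namespace: ${POD_NAMESPACE}
    # decode the inner JSON string in "log" and merge its keys into the event root
    - decode_json_fields:
        fields: ["log"]
        process_array: false
        max_depth: 5
        target: ""
        overwrite_keys: true
```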

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.