Filebeat - how to control the nesting level of JSON object parsing - decode_json_fields

Filebeat - how can I control the nesting level of decode_json_fields?
max_depth does not seem to help in my case :frowning:

Goal: parse '/var/lib/docker/containers/*/*.log' while limiting the maximum JSON depth.

name: "host-01"
queue:
  mem:
    events: 16384
    # batch size of events sent to the outputs; "0" makes events immediately available to be sent
    flush.min_events: 0


filebeat:
  prospectors:
    - type: log
      paths:
        - '/tmp/test.log'
      json:
        # key on which to apply the line filtering and multiline settings
        message_key: log
        keys_under_root: true
        add_error_key: true
      processors:
      - decode_json_fields:
          fields: ["log"]
          process_array: false
          max_depth: 1
          overwrite_keys: false

output:
  console:
    pretty: true

Example

echo '{"log":"{ "status": { "foo": { "bar": 1 } }, "bytes_sent": "0", "gzip_ratio": "-", "hostname": "cb7b5441f0da" }\n","stream":"stdout","time":"2018-12-29T11:25:36.130729806Z"}' >> /tmp/test.log

Actual result:

{
...
  "log": {
    "status": {
      "foo": {
        "bar": 1
      }
    },
    "bytes_sent": "0",
    "gzip_ratio": "-",
    "hostname": "cb7b5441f0da"
...
}

Expected result:

{
...
  "log": {
    "status": "{  \"foo\": { \"bar\": 1 } }"
   },
  "bytes_sent": "0",
  "gzip_ratio": "-",
  "hostname": "cb7b5441f0da"
...
}

How can I control the depth at which nested JSON objects are decoded?
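For reference, the behaviour I would expect from max_depth can be sketched in Python (a hypothetical model of the semantics, not Filebeat's actual implementation): objects nested deeper than the limit should be kept as serialized strings rather than decoded.

```python
import json

def decode_limited(s, max_depth):
    """Decode a JSON string, but re-serialize any object nested deeper
    than max_depth back into a string. This is only a sketch of the
    semantics I expect from decode_json_fields' max_depth option."""
    def limit(value, depth):
        if isinstance(value, dict):
            if depth >= max_depth:
                # too deep: keep this subtree as a JSON string
                return json.dumps(value)
            return {k: limit(v, depth + 1) for k, v in value.items()}
        return value
    return limit(json.loads(s), 0)

log = '{"status": {"foo": {"bar": 1}}, "bytes_sent": "0"}'
print(decode_limited(log, 1))
# → {'status': '{"foo": {"bar": 1}}', 'bytes_sent': '0'}
```

With max_depth: 1 the top-level keys are decoded, but the value of "status" stays a string, which is exactly the "Expected result" shown above.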

There is some explanation here: max_depth nested json not working · Issue #9834 · elastic/beats · GitHub
but removing json: and leaving only decode_json_fields doesn't help.

Your example JSON could not be parsed unless I removed the unescaped " characters inside the value of log.
However, the processor seems to ignore max_depth option. I need a bit more time to investigate the issue. Thank you for your patience.
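To illustrate the parsing problem: for the sample line to be valid JSON, the quotes inside the log value must be escaped with backslashes. A minimal check of a corrected variant (the exact field values here are assumptions based on the example above):

```python
import json

# Hypothetical corrected log line: the JSON embedded in "log" has its
# inner quotes escaped, so the outer document is valid JSON.
line = '{"log":"{ \\"status\\": { \\"foo\\": { \\"bar\\": 1 } }, \\"bytes_sent\\": \\"0\\" }\\n","stream":"stdout"}'

event = json.loads(line)          # outer document parses
inner = json.loads(event["log"])  # embedded JSON parses too
print(inner["status"]["foo"]["bar"])  # → 1
```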


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.