Filebeat decode_json_fields processor max_depth option not working

The decode_json_fields processor's max_depth option does not seem to work, or I am misunderstanding the documentation.

To avoid creating a huge number of document fields in an Elasticsearch log index, I want to limit how deeply nested JSON is parsed.

Filebeat version 7.8.0 (also tested on 6.8.10 with the same result)

/tmp/filebeat.conf:

filebeat.inputs:
- type: log
  paths:
    - /tmp/filebeat.input

processors:
  - decode_json_fields:
      fields: ["message"]
      max_depth: 1
      target: "parsed"

output.console:
  pretty: true

/tmp/filebeat.input:

{"top": "top_value", "top_obj": {"level_1": "level_1_value", "level_1_obj": {"level_2": "level_2_value", "level_2_obj": {"level_3": "level_3_value"}}}}

Command:

filebeat -e -c /tmp/filebeat.conf

Result:

"parsed": {
  "top_obj": {
    "level_1_obj": {
      "level_2": "level_2_value",
      "level_2_obj": {
        "level_3": "level_3_value"
      }
    },
    "level_1": "level_1_value"
  },
  "top": "top_value"
}

Expected result:

"parsed": {
  "top_obj": {
    "level_1_obj": "{\"level_2\": \"level_2_value\", \"level_2_obj\": {\"level_3\": \"level_3_value\"}}",
    "level_1": "level_1_value"
  },
  "top": "top_value"
}

Thank you for reporting this! I was able to reproduce it.

Could you please open a GitHub issue for this potential bug?

C.

Thank you!

Related GitHub Issue: https://github.com/elastic/beats/issues/19830
