Filebeat decode_json_fields failed

If the message is longer than 1116 characters, it is lost after formatting! When I use the Filebeat decode_json_fields processor to parse a JSON object, any JSON key whose value is too long is simply discarded. Which attribute do I need to set?

The message field is a string. Whatever characters it contains, once it exceeds a certain length it is lost after being parsed and stored in Elasticsearch.
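For context, Elasticsearch `keyword` fields have an `ignore_above` mapping parameter: values longer than the limit are accepted but not indexed, and Filebeat's default templates apply `ignore_above: 1024` to dynamically mapped strings, which would match values of roughly this length disappearing. A sketch of raising the limit in a custom fields file such as the `yy_server_log_fields.yml` referenced in the config below (the field name here is hypothetical):

```yaml
# Hypothetical entry for yy_server_log_fields.yml: a keyword field with its
# ignore_above limit raised so that long values are still indexed.
- key: yylog
  title: yylog
  description: Fields decoded from the JSON log message.
  fields:
    - name: yylog.content
      type: keyword
      ignore_above: 8191
```

This only takes effect once the template is (re)written to Elasticsearch and a new index is created from it.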

Here is the content of my config file:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /opt/site/TestAPI/collectlog/current.log
  multiline.pattern: '^{ "time"'
  multiline.negate: true
  multiline.match: after
fields_under_root: true
fields: {server: "VMTest" }

# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  index.number_of_replicas: 0
setup.template.type: index
setup.template.enabled: true
setup.template.overwrite: true
setup.template.fields: "yy_server_log_fields.yml"
setup.template.name: "yy_server_log_index_template"
setup.template.pattern: "yy_server_log*"
setup.ilm.enabled: false 

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "yy_server_log_%{+yyyy.MM.dd}"
  #protocol: "https" 
  #api_key: "id:api_key"
  username: "elastic"
  password: "some_password"

# ================================= Processors =================================
processors:
  - decode_json_fields:
      fields: ["message"]
      process_array: false
      max_depth: 1
      target: "yylog"
      overwrite_keys: false
      add_error_key: true
  - drop_fields:
      fields: ["log","host","input","ecs","agent"] 
      ignore_missing: true 
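As a side note, decode_json_fields itself does not truncate long values, which points the search toward the index mapping rather than the processor. A rough Python model of the processor as configured above (simplified for illustration, not the actual Beats implementation):

```python
import json

def decode_json_fields(event, fields=("message",), target="yylog",
                       overwrite_keys=False, add_error_key=True):
    """Simplified model of Filebeat's decode_json_fields processor:
    parse each listed string field as JSON and nest it under `target`."""
    for field in fields:
        raw = event.get(field)
        if not isinstance(raw, str):
            continue
        try:
            decoded = json.loads(raw)
        except ValueError:
            if add_error_key:
                event["error"] = {"message": "Error decoding JSON"}
            continue
        if not isinstance(decoded, dict):
            continue
        bucket = event.setdefault(target, {}) if target else event
        for key, value in decoded.items():
            # overwrite_keys: false keeps any key that already exists
            if overwrite_keys or key not in bucket:
                bucket[key] = value
    return event

# A log line whose JSON payload holds a 2000-character value.
line = json.dumps({"time": "2021-01-01T00:00:00Z", "msg": "x" * 2000})
event = decode_json_fields({"message": line})
print(len(event["yylog"]["msg"]))  # 2000: nothing is lost at this stage
```

Since the full 2000-character value survives decoding, the loss described above most likely happens at indexing time in Elasticsearch, not in the processor.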

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.