Filebeat decode_json_fields isn't parsing arrays

Hi,

We are using the Filebeat processor decode_json_fields to process log messages in JSON.
The problem we're having is that some of our logs are multi-layered, with quite a few arrays and some nested objects. We tried using decode_json_fields with the process_array flag set to true, but Filebeat still parses everything that follows '[' into a single field.

This is what we get in Kibana's Discover:

 @timestamp                   Oct 28, 2019 @ 12:22:06.610
 t _id                        pKEaEm4B7zyLz8s9M8Xe
 t _index                     filebeat-7.3.2-2019.10.28-000001
 # _score                     -
 t _type                      _doc
 t agent.ephemeral_id         7c3cd7b7-2f76-424e-a417-5aa82f119bed
 t agent.hostname             ******
 t agent.id                   571154fa-e864-49b1-a224-9d405befeddf
 t agent.type                 filebeat
 t agent.version              7.3.2
 ? circuitPath                { "policy": "Health Check LB", "execTime": 0, "filters": [ { "class": "com.vordel.circuit.attribute.CompareAttributeFilter", "status": "Pass", "filterTime": 1557733771853, "execTime": 0, "espk": "PrimaryStore-43595d15-05f6-4135-aa9a-e8b9b2a35bda:-439438454261778670", "name": "Compare Attribute", "type": "CompareAttributeFilter" }, { "execTime": 0, "espk": "PrimaryStore-43595d15-05f6-4135-aa9a-e8b9b2a35bda:-6704867506249825459", "name": "Set Message - OK", "type": "ChangeMessageFilter", "class": "com.vordel.circuit.conversion.ChangeMessageFilter", "status": "Pass", "filterTime": 1557733771853 }, { "execTime": 0, "espk": "PrimaryStore-43595d15-05f6-4135-aa9a-e8b9b2a35bda:-5308572925601299001", "name": "Reflect - OK", "type": "ReflectFilter", "class": "com.vordel.circuit.net.ReflectFilter", "status": "Pass", "filterTime": 1557733771853 } ] }
 ? correlationId              *******************
 t ecs.version                1.0.1
 t host.name                  *****
 t input.type                 log
 t log.file.path              *****
 # log.offset                 747,788
 ? processInfo.domainId       *******************
 ? processInfo.groupId        group-2
 ? processInfo.groupName      ******
 ? processInfo.hostname       f3slsea310
 ? processInfo.serviceId      instance-6
 ? processInfo.serviceName    ******
 ? processInfo.version        7.6.2 SP1
 suricata.eve.timestamp       Oct 28, 2019 @ 12:22:06.610
 ? timestamp                  1557733771854

As you can see, the moment Filebeat gets to the nested array "circuitPath", it parses everything into a single field until the array is closed.

Here is an example of one of the logs we're having a problem with:

{"timestamp":1557733646862,"correlationId":"***************","processInfo":{"hostname":"f3slsea310","domainId":"*******************","groupId":"group-2","groupName":"*****","serviceId":"instance-6","serviceName":"*******","version":"7.6.2 SP1"},"circuitPath":[ { "policy": "Health Check LB", "execTime": 0, "filters": [  { "espk": "PrimaryStore-43595d15-05f6-4135-aa9a-e8b9b2a35bda:-439438454261778670", "name": "Compare Attribute", "type": "CompareAttributeFilter", "class": "com.vordel.circuit.attribute.CompareAttributeFilter", "status": "Pass", "filterTime": 1557733646861, "execTime": 0 } , { "espk": "PrimaryStore-43595d15-05f6-4135-aa9a-e8b9b2a35bda:-6704867506249825459", "name": "Set Message - OK", "type": "ChangeMessageFilter", "class": "com.vordel.circuit.conversion.ChangeMessageFilter", "status": "Pass", "filterTime": 1557733646861, "execTime": 0 } , { "espk": "PrimaryStore-43595d15-05f6-4135-aa9a-e8b9b2a35bda:-5308572925601299001", "name": "Reflect - OK", "type": "ReflectFilter", "class": "com.vordel.circuit.net.ReflectFilter", "status": "Pass", "filterTime": 1557733646861, "execTime": 0 }  ] } ]}

filebeat.yml:

processors:
   - decode_json_fields:
       fields: [message]
       max_depth: 11
       process_array: true
       overwrite_keys: true
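
For reference, here is the same processor block with the other documented decode_json_fields options spelled out. The target and add_error_key values are illustrative additions we'd try, not a confirmed fix:

processors:
   - decode_json_fields:
       fields: [message]      # the field holding the raw JSON string
       max_depth: 11          # how many levels of nesting to decode
       process_array: true    # also decode arrays, not just objects
       overwrite_keys: true   # decoded keys replace existing ones
       target: ""             # merge decoded keys into the document root
       add_error_key: true    # set error.message when decoding fails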

Have you tried setting max_depth? See more: https://www.elastic.co/guide/en/beats/filebeat/current/decode-json-fields.html

Hi @kvch,

Yes, I have tried max_depth, but it didn't change anything. I even tested multiline, hoping it would process the array located in the middle of the log, but it just made things worse.
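
For completeness, this is roughly the multiline setup we tested. The paths value is a placeholder, and the pattern is just our guess at marking lines that start a new JSON event, so treat this as a sketch of what we tried rather than a working config:

filebeat.inputs:
  - type: log
    paths:
      - /path/to/our/logs/*.log   # placeholder path
    multiline.pattern: '^\{'      # a line starting with '{' begins a new event
    multiline.negate: true        # lines not matching the pattern...
    multiline.match: after        # ...get appended to the previous event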
