Process specific dynamic array field from JSON file

Hi people!

Recently I ran into a situation where a service posts its logs to a single .json file, with the entries nested in a deeper-level field.

The structure is something like this...

{
    "field1": "value1",
    "field2": "value2",
    "field3": "value3",
    "messages": [
        {
            "id": "0x00001",
            "message": "Message 1"
        },
        {
            "id": "0x00002",
            "message": "Message 2"
        },
        {
            "id": "0x00003",
            "message": "Message 3"
        },
        {
            "id": "0x00004",
            "message": "Message 4"
        }
    ]
}

So my question is... is there any way I could read from that specific "messages" field with Filebeat?

Hi

I don't know if this really answers your question, but you need to add the following to your filebeat.yml file in the processors section, following what's already there.

  - decode_json_fields:
      fields: ["message"]
      target: ""

Restart Filebeat.

Tell me if it works

Hi, thanks for the suggestion!

It does decode the message field, but it flattens everything into the same log line, placing all of the array items as fields of a single event. What I was trying to achieve is a little different...

What I was trying to do is split these array items into separate log events via Filebeat (so far I couldn't make it work).

But if there is no built-in solution for this case, I think I'll have to generate another plaintext file from this JSON (via cron or some kind of daemon), writing each item of the array as a compact single-line JSON object so that Filebeat can then process the file line by line.
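If you go that route, a minimal sketch of the conversion step could look like this; the output filename is made up, and in a real setup you would `json.load` the service's file instead of using the inline sample:

```python
import json

# Sample document shaped like the service's log file (from this thread).
# In practice, load it from the file the service writes to.
doc = {
    "field1": "value1",
    "messages": [
        {"id": "0x00001", "message": "Message 1"},
        {"id": "0x00002", "message": "Message 2"},
    ],
}

# Write each item of the "messages" array as one compact JSON object
# per line (NDJSON), so Filebeat can pick the file up line by line.
with open("messages.ndjson", "w") as out:
    for item in doc["messages"]:
        out.write(json.dumps(item, separators=(",", ":")) + "\n")
```

Each line of the resulting file is then a self-contained event that Filebeat can decode individually.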

Oh okay, I understand

I'm a beginner with the ELK stack, so I don't know how to help you further, but good luck!
