Utilize Data Parsed From Nested JSON

So I am parsing some files that contain device data stored as single-line nested JSON.

I am able to parse the data from the files using the Logstash json filter.
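
For reference, a single line in one of these files looks roughly like this (a hypothetical sample, reconstructed from the parsed output further down):

{"receivetime": "15-01-2020 04:01:35.306", "error": 0, "message": "OK", "kind": "system", "Description": "Cisco", "data": [{"properties": {"key": "System Description", "value": "Cisco Adaptive Security Appliance Version 9.2(4)33"}, "displayname": "System Description: Cisco Adaptive Security Appliance Version 9.2(4)33"}]}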

My Logstash .conf:

# READ - READS FILES WITH .system EXTENSION

input {
    file {
        mode => "read"
        path => ["/root/sysinfo/*.system"]
        type => "system-info"
        sincedb_path => "/dev/null"
    }
}


#
# FILTER - TRY TO FILTER THE INPUT FILE USING JSON FILTER

filter {
    if [type] == "system-info" {
        json {
            source => "message"
            add_tag => [ "json-parsed" ]
        }
    }
}




output {
    stdout { }
}
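
(stdout with no codec specified defaults to the rubydebug codec, which is the pretty-printed format you see below.)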

A portion of the output after that looks like this:

{
          "error" => 0,
    "receivetime" => "15-01-2020 04:01:35.306",
        "message" => "OK",
           "type" => "system-info",
    "Description" => "Cisco",
     "@timestamp" => 2020-01-23T23:21:32.245Z,
           "kind" => "system",
           "data" => [
        [ 0] {
             "properties" => {
                "value" => "Cisco Adaptive Security Appliance Version 9.2(4)33",
                  "key" => "System Description"
            },
            "displayname" => "System Description: Cisco Adaptive Security Appliance Version 9.2(4)33"
        },
        [ 1] {
             "properties" => {
                "value" => "1.3.6.1.4.1.9.1.745",
                  "key" => "ObjectID"
            },
            "displayname" => "ObjectID: 1.3.6.1.4.1.9.1.745"
        },
        [ 2] {
             "properties" => {
                "value" => "email@email.com",
                  "key" => "Contact Info"
            },
        }
    ],
       "@version" => "1",
           "host" => "hostname",
           "path" => "/root/sysinfo/testing.system",
           "tags" => [
        [0] "json-parsed"
    ]
}

Is it possible to use only the data in the fields below and run it through the Logstash kv filter?

"displayname" => "ObjectID: 1.3.6.1.4.1.9.1.745"
"displayname" => "System Description: Cisco Adaptive Security Appliance Version 9.2(4)33"

To index into Elasticsearch as:

"ObjectID" => "1.3.6.1.4.1.9.1.745"
"System Description" => "Cisco Adaptive Security Appliance Version 9.2(4)33"

Thanks in advance!

You would do that in a ruby filter. I have not tested this, but it should guide you...

ruby {
    code => '
        # Walk the [data] array and promote the key/value pairs
        # we care about to top-level fields on the event.
        event.get("[data]").each { |x|
            k = x["properties"]["key"]
            v = x["properties"]["value"]
            if ["ObjectID", "System Description"].include? k
                event.set(k, v)
            end
        }
    '
}
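
If it works, you should end up with the two top-level fields you described, e.g. "ObjectID" => "1.3.6.1.4.1.9.1.745".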

That worked like a charm! Thanks bud!

This code works; however, it doesn't account for duplicate keys that have different values.

[170] {
     "properties" => {
        "value" => "ABC12345",
           "id" => "4",
          "key" => "Serial Number (WS-C2960X-48LPS-L)"
    },
    "displayname" => "Serial Number (WS-C2960X-48LPS-L)#4: FCW2029B2EP"
},

[177] {
     "properties" => {
        "value" => "654321ABC",
           "id" => "11",
          "key" => "Serial Number (WS-C2960X-48LPS-L)"
    },
    "displayname" => "Serial Number (WS-C2960X-48LPS-L)#11: FOC2233V0FJ"
}

Would you know of a conditional that could preserve the identical keys by incrementing or renaming them somehow?

Finding how many of a given key the filter has previously seen would be complicated. You could modify the code so that, instead of calling event.set directly, it stashes each key and value in a local hash, making sure each value is always an array. Then, after iterating over [data], iterate over the local hash: if an array has length 1, do an event.set of the key with member 0 of the array; otherwise iterate over the array with each_index and do the event.set using string interpolation, along these lines:

a.each_index { |x|
    event.set("#{k}#{x}", a[x])
}
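
Also untested, but putting the whole approach together might look something like this (I've dropped the include? check so every key is collected; add it back if you only want certain ones):

ruby {
    code => '
        # key => array of every value seen for that key
        h = Hash.new { |hash, key| hash[key] = [] }

        # First pass: collect each value under its key.
        event.get("[data]").each { |x|
            k = x["properties"]["key"]
            v = x["properties"]["value"]
            h[k] << v if k and v
        }

        # Second pass: unique keys are set as-is; duplicates
        # get a numeric suffix via string interpolation.
        h.each { |k, a|
            if a.length == 1
                event.set(k, a[0])
            else
                a.each_index { |x|
                    event.set("#{k}#{x}", a[x])
                }
            end
        }
    '
}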

Ok I'll look into that, thanks for your help!
