Parsing nested JSON inside JSON: getting NilClass errors

[ERROR] 2020-09-04 21:58:24.155 [[main]>worker0] ruby - Ruby exception occurred: undefined method 'each' for nil:NilClass
[ERROR] 2020-09-04 21:58:24.155 [[main]>worker0] ruby - Ruby exception occurred: undefined method 'each' for nil:NilClass
[WARN ] 2020-09-04 21:58:24.156 [[main]>worker0] split - Only String and Array types are splittable. field:results is of type = NilClass

{
          "host" => "localhost",
       "message" => "{",
          "type" => "json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_rubyexception",
        [2] "_split_type_failure"
    ],
          "path" => "/etc/logstash/conf.d/data1.json",
    "@timestamp" => 2020-09-04T19:58:20.042Z
}
{
          "host" => "localhost",
       "message" => "      \"port1\":{",
          "type" => "json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_rubyexception",
        [2] "_split_type_failure"
    ],
          "path" => "/etc/logstash/conf.d/data1.json",
    "@timestamp" => 2020-09-04T19:58:20.099Z
}

Config File

input {
    file {
        path => "/etc/logstash/conf.d/data1.json"
        type => "json"
        codec => "json"
        mode => "read"
        sincedb_path => "/dev/null"
        start_position => "beginning"
    }
}
filter {
     ruby {
        code => '
            val_a = []
            event.get("[results]").each do |k, v|
                val_a << v
            end
            event.set("[results]", val_a)
        '
    }
    split {
        field => "results"
    }
}
output {
    stdout {
        codec => rubydebug
    }
}

The following is the data I want to parse. I want one event per entry under results, keeping all the other top-level fields from the JSON object, and then store the events in Elasticsearch. (A sketch of the desired output follows the data.)

{
"http_method": "GET",
"revision": "1598475252.192415",
"results": {
    "port1": {
        "id": "port1",
        "name": "port1",
        "alias": "",
        "mac": "00:00:00:00:00:00",
        "ip": "0.0.0.0",
        "mask": 0,
        "link": false,
        "speed": 0,
        "duplex": -1,
        "tx_packets": 0,
        "rx_packets": 0,
        "tx_bytes": 0,
        "rx_bytes": 0,
        "tx_errors": 0,
        "rx_errors": 0
    },
    "port2": {
        "id": "port2",
        "name": "port2",
        "alias": "",
        "mac": "00:00:00:00:00:00",
        "ip": "0.0.0.0",
        "mask": 0,
        "link": false,
        "speed": 0,
        "duplex": 0,
        "tx_packets": 0,
        "rx_packets": 0,
        "tx_bytes": 0,
        "rx_bytes": 0,
        "tx_errors": 0,
        "rx_errors": 0
    },
    "port3": {
        "id": "port3",
        "name": "port3",
        "alias": "",
        "mac": "00:00:00:00:00:00",
        "ip": "0.0.0.0",
        "mask": 0,
        "link": false,
        "speed": 0,
        "duplex": 0,
        "tx_packets": 0,
        "rx_packets": 0,
        "tx_bytes": 0,
        "rx_bytes": 0,
        "tx_errors": 0,
        "rx_errors": 0
    }
},
    "vdom": "root",
    "path": "system",
    "name": "interface",
    "status": "success",
    "serial": "FG1K5D3I14801563",
    "version": "v6.2.3",
    "build": 1066
}
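
For illustration, after the split I would expect one event per port that still carries the top-level fields, roughly like this (some port fields omitted for brevity; the exact layout is just my assumption):

{
    "http_method" => "GET",
       "revision" => "1598475252.192415",
        "results" => {
           "id" => "port1",
         "name" => "port1",
          "mac" => "00:00:00:00:00:00",
           "ip" => "0.0.0.0",
         "link" => false,
        "speed" => 0
    },
           "vdom" => "root",
           "path" => "system",
           "name" => "interface",
         "status" => "success",
        "version" => "v6.2.3"
}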

Any help would be appreciated.
Thanks

That looks like you have pretty-printed JSON in a file, which the file input consumes one line at a time. None of those lines is valid JSON on its own, so the json codec gets a parse failure. The ruby filter then gets an exception because there is no [results] field, and the split filter fails for the same reason (the ruby filter does not get as far as setting the field if it raises an exception).

See here for the solution.

The multiline codec will replace the json codec, so you will have to add a json filter.
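
Something along these lines should work. This is only a sketch: the multiline pattern is an arbitrary string assumed never to occur in the file, so every line gets appended to the same event, and auto_flush_interval makes sure the last event is flushed once the file has been read.

input {
    file {
        path => "/etc/logstash/conf.d/data1.json"
        mode => "read"
        sincedb_path => "/dev/null"
        start_position => "beginning"
        codec => multiline {
            # A pattern that never matches a line in the file, so the codec
            # keeps appending every line to the previous (same) event.
            pattern => "^NeverMatchThis"
            negate => true
            what => "previous"
            # Flush the pending event after one second of inactivity so the
            # final, unterminated event is not held back.
            auto_flush_interval => 1
        }
    }
}
filter {
    json {
        source => "message"
        remove_field => [ "message" ]
    }
}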

Thanks, I used that, but now I'm getting this:

split - Only String and Array types are splittable. field:results is of type = Hash

Are you still using the ruby filter to convert the results hash to an array? Maybe try a different field name when doing the event.set and split, just for testing purposes.
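
For what it is worth, here is a sketch of that conversion, writing the array into a separate field (portList is just a name picked for testing) and splitting on that instead of [results]:

filter {
    ruby {
        code => '
            results = event.get("results")
            if results.is_a?(Hash)
                # Collect the per-port hashes into an array so that split can
                # iterate over them. portList is an arbitrary field name used
                # here instead of overwriting [results].
                event.set("portList", results.values)
            end
        '
    }
    split {
        field => "portList"
    }
}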
