How to extract individual metrics from an array of objects

I need to extract the data that Logstash receives so that I can graph the metrics. When the events reach Kibana I cannot graph them, because the metrics arrive inside an array. I need help breaking up that array so that each metric is sent individually.

I need to get the data out of that array and separate all of the metrics into individual fields, please.

{
  "_index": "radwareefecty",
  "_type": "_doc",
  "_id": "ird8YXkBQzzVpIqvFtvR",
  "_version": 1,
  "_score": null,
  "_source": {
    "@version": "1",
    "SlbStatLinkpfRServerTable": [
      {
        "TotBwPeak": 0,
        "IpAddr": "190.145.144.65",
        "State": 2,
        "DnBwTot": 0,
        "CurrSess": 0,
        "DnBwPeak": 0,
        "TotBwPeakTmSt": "N/A",
        "TotCurrUsage": "--",
        "UpBwPeak": 0,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "N/A",
        "DnBwPeakPer": "--",
        "UpBwTot": 0,
        "UpDnBwTot": 0,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "Claro-Internet4",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "N/A",
        "TotBwPeakPer": "--"
      },
      {
        "TotBwPeak": 0,
        "IpAddr": "190.242.127.16",
        "State": 2,
        "DnBwTot": 0,
        "CurrSess": 0,
        "DnBwPeak": 0,
        "TotBwPeakTmSt": "N/A",
        "TotCurrUsage": "--",
        "UpBwPeak": 0,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "N/A",
        "DnBwPeakPer": "--",
        "UpBwTot": 0,
        "UpDnBwTot": 0,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "Columbus-Internet5",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "N/A",
        "TotBwPeakPer": "--"
      },
      {
        "TotBwPeak": 1.5,
        "IpAddr": "190.143.70.35",
        "State": 2,
        "DnBwTot": 3,
        "CurrSess": 0,
        "DnBwPeak": 0.1,
        "TotBwPeakTmSt": "02:28:52 Sat May  1, 2021",
        "TotCurrUsage": "--",
        "UpBwPeak": 1.4,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "01:53:29 Sat May  1, 2021",
        "DnBwPeakPer": "--",
        "UpBwTot": 1617.9,
        "UpDnBwTot": 1620.9,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "Internet_Navegacion-Claro_1",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "02:28:52 Sat May  1, 2021",
        "TotBwPeakPer": "--"
      },
      {
        "TotBwPeak": 5.8,
        "IpAddr": "190.144.221.225",
        "State": 2,
        "DnBwTot": 5.3,
        "CurrSess": 0,
        "DnBwPeak": 0.3,
        "TotBwPeakTmSt": "01:52:18 Sat May  1, 2021",
        "TotCurrUsage": "--",
        "UpBwPeak": 5.4,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "01:52:18 Sat May  1, 2021",
        "DnBwPeakPer": "--",
        "UpBwTot": 125.7,
        "UpDnBwTot": 130.9,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "Internet_Navegacion-Claro_2",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "01:52:18 Sat May  1, 2021",
        "TotBwPeakPer": "--"
      },
      {
        "TotBwPeak": 0.1,
        "IpAddr": "200.122.229.81",
        "State": 2,
        "DnBwTot": 0,
        "CurrSess": 0,
        "DnBwPeak": 0,
        "TotBwPeakTmSt": "19:53:13 Wed Apr 14, 2021",
        "TotCurrUsage": "--",
        "UpBwPeak": 0,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "19:53:13 Wed Apr 14, 2021",
        "DnBwPeakPer": "--",
        "UpBwTot": 0.9,
        "UpDnBwTot": 1,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "Internet_Navegacion-UNE_1",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "19:53:13 Wed Apr 14, 2021",
        "TotBwPeakPer": "--"
      },
      {
        "TotBwPeak": 55,
        "IpAddr": "10.10.105.16",
        "State": 1,
        "DnBwTot": 0,
        "CurrSess": 0,
        "DnBwPeak": 0,
        "TotBwPeakTmSt": "01:53:34 Sat May  1, 2021",
        "TotCurrUsage": "--",
        "UpBwPeak": 55,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "N/A",
        "DnBwPeakPer": "--",
        "UpBwTot": 55.6,
        "UpDnBwTot": 55.6,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "L2L_Claro",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "01:53:34 Sat May  1, 2021",
        "TotBwPeakPer": "--"
      },
      {
        "TotBwPeak": 0.8,
        "IpAddr": "10.10.100.16",
        "State": 1,
        "DnBwTot": 0,
        "CurrSess": 0,
        "DnBwPeak": 0,
        "TotBwPeakTmSt": "01:52:38 Sat May  1, 2021",
        "TotCurrUsage": "--",
        "UpBwPeak": 0.8,
        "DwBwUSage": "--",
        "DnBwPeakTmSt": "N/A",
        "DnBwPeakPer": "--",
        "UpBwTot": 3.8,
        "UpDnBwTot": 3.8,
        "UpBwPeakPer": "--",
        "LastTranfetTmSt": "N/A",
        "DwBwCurr": 0,
        "Index": "L2L_CyW",
        "UpBwCurr": 0,
        "UpBwUsage": "--",
        "TotCurrbw": 0,
        "UpBwPeakTmSt": "01:52:38 Sat May  1, 2021",
        "TotBwPeakPer": "--"
      }
    ],
    "@timestamp": "2021-05-12T16:48:00.064Z"
  },
  "fields": {
    "@timestamp": [
      "2021-05-12T16:48:00.064Z"
    ]
  },
  "sort": [
    1620838080064
  ]
}

This is the Logstash configuration:

input {
  http_poller {
    urls => {
      kvh => "https://Default_Generated_Alteon_BBI_Cert:443/config/SlbStatLinkpfRServerTable"
    }
#    cacert => "/path/downloaded_cert.pem"
    truststore => "/path/downloaded_truststore.jks"
    user => "user"
    password => "secret"
    truststore_password => "secret"
    schedule => { cron => "* * * * * UTC" }
    codec => "json"
#    ssl => true
#    ssl_certificate_verification => true
  }
}

filter {
  ruby {
    code => '
      # Coerce numeric strings in each array entry to floats;
      # non-numeric values (e.g. "--", "N/A") are left unchanged.
      def is_number?(string)
        true if Float(string) rescue false
      end

      t = event.get("SlbStatLinkpfRServerTable")
      if t
        newT = []
        t.each { |x|
          newX = {}
          x.each { |k, v|
            v = v.to_f if is_number?(v)
            newX[k] = v
          }
          newT << newX
        }
        event.set("SlbStatLinkpfRServerTable", newT)
      end
    '
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["https://425dc991b9ec443hgggddkkkqf3ef706bd675a.us-central1.gcp.cloud.es.io:9243"]
    user => "elastic"
    password => "secret"
    index => "radwareefecty"
  }
}
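The number-coercion logic inside the ruby filter above can be checked in plain Ruby, outside Logstash. This is a standalone sketch using a trimmed entry from the array in the event; the `is_number?` helper is the same one defined in the filter:

```ruby
# Standalone check of the coercion used in the ruby filter:
# strings that parse as floats are converted, everything else is kept as-is.
def is_number?(string)
  true if Float(string) rescue false
end

entry = {
  "TotBwPeak"    => "1.5",            # numeric string  -> coerced to 1.5
  "TotCurrUsage" => "--",             # placeholder     -> left unchanged
  "IpAddr"       => "190.143.70.35"   # not a float     -> left unchanged
}

coerced = entry.map { |k, v| [k, is_number?(v) ? v.to_f : v] }.to_h
```

Note that an IP address like `"190.143.70.35"` does not parse as a float, so it survives the coercion untouched.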

If you want a separate event for each array entry, you could use

split { field => "SlbStatLinkpfRServerTable" }
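As a rough plain-Ruby sketch of what split does (the field name is taken from the event above; this is an illustration of the semantics, not the filter's implementation): one source event whose field holds an array becomes N events, each carrying a single element under the same field name, with all other fields copied.

```ruby
# Illustration of the split filter's semantics in plain Ruby.
source_event = {
  "@version" => "1",
  "SlbStatLinkpfRServerTable" => [
    { "Index" => "Claro-Internet4",    "State" => 2 },
    { "Index" => "Columbus-Internet5", "State" => 2 }
  ]
}

# One new event per array element; the other fields are carried over.
split_events = source_event["SlbStatLinkpfRServerTable"].map do |element|
  clone = source_event.reject { |k, _| k == "SlbStatLinkpfRServerTable" }
  clone.merge("SlbStatLinkpfRServerTable" => element)
end
```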

I need to extract the data contained in the array, i.e. extract each field that the array entries contain, and take that data out of the array:

What do you want the _source field to look like?

I want them to be outside the array; I mean, each metric from the array as an individual field.

I mean, take each metric from the array and put it into an individual field.

This is what appears when using the split filter:

{

                     "@version" => "1",
                   "@timestamp" => 2021-05-12T18:00:00.599Z,
    "SlbStatLinkpfRServerTable" => [
        [ 0] {
            "UpBwPeak" => nil,
                 "0.0" => nil
        },
        [ 1] {
            "Claro-Internet4" => nil,
                      "Index" => nil
        },
        [ 2] {
            "TotCurrUsage" => nil,
                      "--" => nil
        },
        [ 3] {
            "DwBwCurr" => nil,
                 "0.0" => nil
        },
        [ 4] {
                  "0.0" => nil,
            "TotCurrbw" => nil
        },
        [ 5] {
            "DwBwUSage" => nil,
                   "--" => nil
        },
        [ 6] {
            "State" => nil,
                "2" => nil
        },
        [ 7] {
                        "N/A" => nil,
            "LastTranfetTmSt" => nil
        },
        [ 8] {
            "DnBwPeakPer" => nil,
                     "--" => nil
        },
        [ 9] {
            "DnBwTot" => nil,
                "0.0" => nil
        },
        [10] {
            "DnBwPeakTmSt" => nil,
                     "N/A" => nil
        },
        [11] {
                     "N/A" => nil,
            "UpBwPeakTmSt" => nil
        },
        [12] {
            "TotBwPeakPer" => nil,
                      "--" => nil
        },
        [13] {
                  "0.0" => nil,
            "TotBwPeak" => nil

I cannot conceive how a split filter would do that.

Where's the data coming from? Are you able to adjust the source to output friendlier JSON?

Otherwise, if the array is the same every time, you could just copy the values out to new fields?
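One way to sketch that combination (an assumption on my part, not a tested config): split first so each event holds one array entry, then use a ruby filter to copy each key of that entry up to the top level and drop the now-redundant field.

```
filter {
  split { field => "SlbStatLinkpfRServerTable" }
  ruby {
    code => '
      # After split, the field should hold a single hash (one array entry).
      entry = event.get("SlbStatLinkpfRServerTable")
      if entry.is_a?(Hash)
        entry.each { |k, v| event.set(k, v) }
        event.remove("SlbStatLinkpfRServerTable")
      end
    '
  }
}
```

Each resulting event would then carry `IpAddr`, `State`, `UpBwTot`, etc. as individual top-level fields, which is what Kibana needs for graphing.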

The data comes through the API that I'm calling; it lives in a web interface on the Radware device, where it is compiled into arrays. When calling it through the API to send it from Logstash to Elasticsearch, I suppose the Logstash configuration file needs some script that walks the array so I can take the data and separate it, so that it does not stay in table form.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.