I am parsing files that contain device data stored as single-line nested JSON. I am able to parse the data from the files using the Logstash json filter.
My Logstash .conf:
# READ - READS FILES WITH .system EXTENSION
input {
  file {
    mode => "read"
    path => ["/root/sysinfo/*.system"]
    type => "system-info"
    sincedb_path => "/dev/null"
  }
}

# FILTER - TRY TO FILTER THE INPUT FILE USING JSON FILTER
filter {
  if [type] == "system-info" {
    json {
      source => "message"
      add_tag => [ "json-parsed" ]
    }
  }
}

output {
  stdout { }
}
A portion of the resulting output looks like this:
{
    "error" => 0,
    "receivetime" => "15-01-2020 04:01:35.306",
    "message" => "OK",
    "type" => "system-info",
    "Description" => "Cisco",
    "@timestamp" => 2020-01-23T23:21:32.245Z,
    "kind" => "system",
    "data" => [
        [ 0] {
            "properties" => {
                "value" => "Cisco Adaptive Security Appliance Version 9.2(4)33",
                "key" => "System Description"
            },
            "displayname" => "System Description: Cisco Adaptive Security Appliance Version 9.2(4)33"
        },
        [ 1] {
            "properties" => {
                "value" => "1.3.6.1.4.1.9.1.745",
                "key" => "ObjectID"
            },
            "displayname" => "ObjectID: 1.3.6.1.4.1.9.1.745"
        },
        [ 2] {
            "properties" => {
                "value" => "email@email.com",
                "key" => "Contact Info"
            }
        }
    ],
    "@version" => "1",
    "host" => "hostname",
    "path" => "/root/sysinfo/testing.system",
    "tags" => [
        [0] "json-parsed"
    ]
}
Is it possible to take only the displayname fields below and run them through the Logstash kv filter?
"displayname" => "ObjectID: 1.3.6.1.4.1.9.1.745"
"displayname" => "System Description: Cisco Adaptive Security Appliance Version 9.2(4)33"
So that they are indexed into Elasticsearch as:
"ObjectID" => "1.3.6.1.4.1.9.1.745"
"System Description" => "Cisco Adaptive Security Appliance Version 9.2(4)33"
Thanks in advance!