I have a vendor appliance that sends syslog events whose message portion is JSON. I can receive the message, parse the syslog header with grok, and capture the JSON payload from the message portion into a field. The parsed JSON field looks something like this (in rubydebug format):
"data" => {
"msg" => "abc",
"version" => "1.0.0"
"event" => {
"class" => "warn",
"label => "SD3"
"src" => {
"ip" => "10.23.2.155",
"port" => 52901
}
}
}
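For context, a raw line from the appliance looks roughly like this (the priority, timestamp, host, and program name here are placeholders, not the vendor's real values):

<13>Oct  9 14:22:01 appliance01 vendorapp[1234]: {"msg":"abc","version":"1.0.0","event":{"class":"warn","label":"SD3","src":{"ip":"10.23.2.155","port":52901}}}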
The real payload is considerably more complex than this, and I don't want to keep all of the data. I'm trying to use mutate to add a new field for each value I do want to keep, named to match the format coming from my other sources:
filter {
  grok {
    match => { "message" => "^\<%{NUMBER:sl_pri}\>%{SYSLOGTIMESTAMP:sl_time} %{SYSLOGHOST:sl_host} %{SYSLOGPROG}: %{GREEDYDATA:sl_msg}" }
  }
  json {
    source => "sl_msg"
    target => "data"
  }
  mutate {
    add_field => { "src_ip" => "%{data.event.src.ip}" }
  }
}
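(In case it matters, I'm viewing the events with a plain stdout output and the rubydebug codec; that's how the snippets above and below were captured:)

output {
  stdout { codec => rubydebug }
}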
However, with this filter, the Logstash output contains the field:
"src_ip" => "%{data.event.src.ip}"
...instead of...
"src_ip" => "10.23.2.155"
Any clues why I'm not able to reference the JSON field values?