Add field from JSON / logstash filter

Hi,
I am trying to add a field from the parsed JSON, but the Logstash filter does not add the value of the field; instead it just adds the literal string from the add_field section.

test.conf

input {
  stdin { codec => json }
}

filter {
  json {
    source => "message"
    target => "parsedjson"
  }
  mutate {
    add_field => {
      # This is the source IP I want to add
      "Source_IP" => "%{[parsedjson][SOURCEIP]}"
    }
  }
}

output { stdout { codec => rubydebug } }

Testing the config above:

$ echo '{ "_index": "syslog-all-rq-2016.12.17", "_type": "syslog-all", "_id": "AVkNjl7jus_W3GFgAZNi", "_score": null, "_source": { "TAGS": ".source.syslog_tcp", "SOURCEIP": "172.19.10.220", "PROGRAM": "369", "PRIORITY": "notice", "MESSAGE": "<14>1 2016-12-17T17:12:29+01:00 TirougaII WinFileService - - [synolog@6574 synotype="WinFileService" ip="172.19.10.225" luser="pc" event="read" isdir="File" fsize="6.00 KB" fname="/DATA/800 ProductionTST/Thumbs.db"][meta sequenceId="62"] Event: read, Path: /DATA01/SOC/800 ProductionTST/Thumbs.db, File/Folder: File, Size: 6.00 KB, User: dtu, IP: 172.18.9.251", "LEGACY_MSGHDR": "369 ", "HOST_FROM": "172.19.10.220", "HOST": "172.19.10.220", "FACILITY": "user", "DATE": "Dec 17 17:12:29", "@version": "1", "@timestamp": "2016-12-17T16:12:29.507Z", "host": "127.0.0.1", "port": 35186, "type": "syslog-all", "tags": [ "Syslog-All" ] }, "fields": { "@timestamp": [ 1481991149507 ] }, "sort": [ 1481991149507 ] }' | /opt/logstash/bin/logstash -f test.conf
Settings: Default pipeline workers: 2
Pipeline main started
{
"_index" => "syslog-all-rq-2016.12.17",
"_type" => "syslog-all",
"_id" => "AVkNjl7jus_W3GFgAZNi",
"_score" => nil,
"_source" => {
"TAGS" => ".source.syslog_tcp",
"SOURCEIP" => "172.19.10.220",
"PROGRAM" => "369",
"PRIORITY" => "notice",
"MESSAGE" => "<14>1 2016-12-17T17:12:29+01:00 TirougaII WinFileService - - [synolog@6574 synotype="WinFileService" ip="172.19.10.225" luser="pc" event="read" isdir="File" fsize="6.00 KB" fname="/DATA/800 ProductionTST/Thumbs.db"][meta sequenceId="62"] Event: read, Path: /DATA01/SOC/800 ProductionTST/Thumbs.db, File/Folder: File, Size: 6.00 KB, User: dtu, IP: 172.18.9.251",
"LEGACY_MSGHDR" => "369 ",
"HOST_FROM" => "172.19.10.220",
"HOST" => "172.19.10.220",
"FACILITY" => "user",
"DATE" => "Dec 17 17:12:29",
"@version" => "1",
"@timestamp" => "2016-12-17T16:12:29.507Z",
"host" => "127.0.0.1",
"port" => 35186,
"type" => "syslog-all",
"tags" => [
[0] "Syslog-All"
]
},
"fields" => {
"@timestamp" => [
[0] 1481991149507
]
},
"sort" => [
[0] 1481991149507
],
"@version" => "1",
"@timestamp" => "2016-12-18T17:50:14.713Z",
"host" => "pc-virtual-machine",
"Source_IP" => "%{[parsedjson][SOURCEIP]}"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}

The value of the source IP is not added; instead the literal field-reference string from add_field ends up in the event.

I'd like to know from the experts: what am I doing wrong?

Thanks

Hi
The problem is here
add_field => {
  "Source_IP" => "%{[_source][SOURCEIP]}"
}

I had overlooked the JSON structure. After adding the _source object name to the field reference, the source IP is now added correctly.
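For anyone landing here later, a minimal sketch of the corrected filter. My understanding (an assumption based on the config above) is that the stdin input's json codec already deserializes the line into top-level event fields, so there is no "message" field for the json filter to parse and [parsedjson] never exists; the nested value has to be referenced through the top-level _source object instead:

```
filter {
  mutate {
    add_field => {
      # Reference nested fields with the [outer][inner] syntax
      "Source_IP" => "%{[_source][SOURCEIP]}"
    }
  }
}
```

With the sample event above, the rubydebug output should then show "Source_IP" => "172.19.10.220".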

Awesome, glad you figured it out Makra. I was just setting up a test instance in hopes of helping you.

Would you please add an update in our other post as well?

Best of luck on your ELK stack journey!

Hi
Troy,
Now the Elasticsearch indices are generated based on the IP. I will update the other post as well.
Thanks for your valuable suggestions and time. :relaxed:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.