Kafka input JSON

Hi,

I have input coming from a Kafka topic into Logstash, which is JSON data.

JSON structure of my data:

{
  "field1" : "val1",
  "field2" : "val2",
  "field3" : {
    "field4" : {
      "field5" : "val5",
      "field6" : "val6"
    }
  }
}

input {
  kafka {
    topic_id => 'topic_name'
    zk_connect => 'zk_host:2181'
    codec => "json"
    add_field => {
      "test1" => "%{field3}"
      "test2" => "%{field1}"
      "test3" => "%{field5}"
      "test4" => "%{field3.field4}"
      "test5" => "%{field3.field4.field5}"
    }
  }
}

In the above input from the conf file, I am getting values for the new fields "test1" and "test2" from the input data.

However, I am not getting the value for "test5", which is a nested element in the JSON data.

Please let me know what is missing here.

I have tried the JSON filter and got a JSON parse failure. I also tried to install the json_encode plugin, which fails with an incorrect URL for the plugin, as shown below:

$ plugin install logstash-filter-json_encode --verbose

-> Installing logstash-filter-json_encode...
Trying https://github.com/null/logstash-filter-json_encode/archive/master.zip...
Failed: IOException[Can't get https://github.com/null/logstash-filter-json_encode/a

Thanks

I got "test5", If I modify the field as below

"test5" => "%{[field3][field4][field5]}"

But what about the json_encode plugin install issue?
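
In case it helps: while the json_encode plugin install keeps failing, one possible workaround (only a sketch, assuming the goal is to serialize "field3" back into a JSON string; the target field name "field3_json" is made up here) would be a ruby filter, which should already be bundled with Logstash:

filter {
  ruby {
    # load the JSON library once when the filter starts
    init => "require 'json'"
    # copy field3 as a JSON-encoded string into a new field (hypothetical name)
    code => "event['field3_json'] = event['field3'].to_json if event['field3']"
  }
}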

Versions being used here are:
elasticsearch-1.7.3
logstash-1.5.4