Not able to extract specific fields from a nested multiline JSON file

Hi,

I have a multiline nested JSON file in the following format:

{
    "info": {
        "name": "name1",
        "ip": "12.12.12.12",
        "jobid": "aaa"
    },
    "ssss": "ssdsdsdsdsdsd",
    "configure": {
        "something": false
    },
    "tests": {
        "tc1": {
            "name": "test1",
            "ip": "12.23.34.45",
            "flags": [
                "xxx",
                "yyy"
            ],
            "type": "sys"
        },
        "tc2": {
            "flags": [
                "rrr",
                "ggg"
            ],
            "ip": "1.2.3.4"
        }
    },
    some more data goes here
    .........
    ........
}

I want to extract [tests][tc1][ip] and [tests][tc2][ip].

I have tried different filters, but none of them seems to work. With the config below, Logstash pushes the entire content of the JSON file into Elasticsearch:

input {
    file {
        path => "path to json file"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        codec => multiline {
            pattern => '^\"tests\":'
            negate => true
            what => "previous"
        }
    }
}

filter {
    mutate {
        add_field => ["newfield", "%{[tests][0][tc1]}"]
        add_field => ["newfield2", "%{[tests][0][tc2]}"]
    }
}
output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => "localhost:9200"
        index => "test_index"
        document_type => "Test_script"
    }
}
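
I was also thinking of trying something along the lines of the sketch below, but I have not verified it. It assumes the whole file is valid JSON (ignoring my "some more data" placeholder), that the multiline codec can join the entire file into a single event, and the names tc1_ip and tc2_ip are just field names I made up:

input {
    file {
        path => "path to json file"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        # Every line that does not start with an opening brace at column 0 gets
        # appended to the previous line, so the whole file becomes one event;
        # auto_flush_interval flushes that event even though no further "{" line
        # ever arrives.
        codec => multiline {
            pattern => '^\{'
            negate => true
            what => "previous"
            auto_flush_interval => 2
        }
    }
}

filter {
    # Parse the joined text so the nested keys become real event fields.
    json {
        source => "message"
    }
    # Copy only the two nested IPs into top-level fields (tc1_ip / tc2_ip are
    # made-up names). Note there is no [0] in the reference, because "tests"
    # is an object, not an array.
    mutate {
        add_field => { "tc1_ip" => "%{[tests][tc1][ip]}" }
        add_field => { "tc2_ip" => "%{[tests][tc2][ip]}" }
    }
}

If that worked, I suppose the remaining fields could be removed with mutate's remove_field so only the IPs get indexed, but again I am not sure about any of this.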

Could you please help me resolve this issue as soon as possible?
Thanks in advance

@sharath3185

Hi, have you solved this issue? I am facing a similar situation.
Kindly revert if you have a solution. Thanks!