Parse json file with 100k/1 million records?

hi @Badger,

We have solved that issue, but now I am facing a different issue which is similar to the one in the link below.

Some fields are added at the end of the JSON, like this:

{"docId": "201","docOwner": "R909168111","docType":"delay","event1Timestamp":"2019-08-20T04:28:40.889Z", "addinfo":[{"key": "clid","value": "p111"}]
}

filter {

  # trim leading/trailing whitespace from the raw line
  mutate {
    strip => ["message"]
  }

  # drop the event that only contains the closing "]}" line
  if [message] == ']}' {
    drop {}
  }

  # remove the trailing comma so the line is valid JSON
  mutate {
    gsub => ["message", ",$", ""]
  }

  # parse the cleaned-up line
  json {
    source => "message"
  }
}
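
(For context on the drop and the gsub: each record sits on its own line in the file, usually followed by a trailing comma, and the last line of the file is just the closing ]}. So, roughly, a raw message like

{"docId": "201","docOwner": "R909168111","docType":"delay","event1Timestamp":"2019-08-20T04:28:40.889Z", "addinfo":[{"key": "clid","value": "p111"}]},

becomes

{"docId": "201","docOwner": "R909168111","docType":"delay","event1Timestamp":"2019-08-20T04:28:40.889Z", "addinfo":[{"key": "clid","value": "p111"}]}

before the json filter parses it.)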

But how do I extract the nested array of inner objects and insert it as-is into Elasticsearch?
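
To illustrate what I mean by extracting the inner objects, a minimal sketch would be something like the split filter below, which turns each addInfo entry into its own event, but I am not sure whether that is needed here, or whether the array can just be indexed as-is:

filter {
  # split the addInfo array so each inner object becomes its own event
  split {
    field => "addInfo"
  }
}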

With the filter above, the Logstash debug output shows:

"addInfo" => [
[0] {
"key" => "clid",
"value" => "p111"
},
[1] {
"key" => "clid1",
"value" => "p222"
}
],

but in Elasticsearch the output is:

"addInfo" : [
{
"key" : "clid",
"value" : "p111"
},
{
"key" : "clid1",
"value" : "p222"
}
],

Is that fine?