I have a JSON document with an array element in it:
{
  "K1": "V1",
  "k2": "v2",
  "k3": [{"a1": "b1", "a2": "b2"}, {"a1": "c1", "a2": "d2"}]
}
After the Logstash filter runs, I save this to Elasticsearch, but for a single JSON record it inserts two documents:
Record 1:
{
  "id": 1,
  "K1": "V1",
  "k2": "v2",
  "a1": "b1",
  "a2": "b2"
}
Record 2:
{
  "id": 2,
  "K1": "V1",
  "k2": "v2",
  "a1": "c1",
  "a2": "d2"
}
Expected:
{
  "id": 1,
  "K1": "V1",
  "k2": "v2",
  "a1.0": "b1",
  "a2.0": "b2",
  "a1.1": "c1",
  "a2.1": "d2"
}
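To be clear, the flattened keys above just show the shape I am after. I imagine something like the following ruby filter could produce it (a hypothetical, untested sketch; the dotted key names are only illustrative), but first I want to understand the current behaviour:

filter {
  ruby {
    code => '
      arr = event.get("k3")
      if arr.is_a?(Array)
        # Copy each array element\'s sub-fields to top-level keys suffixed with the element index
        arr.each_with_index do |item, i|
          item.each { |k, v| event.set("#{k}.#{i}", v) }
        end
        # Drop the original array field once its contents are flattened
        event.remove("k3")
      end
    '
  }
}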
I tried making the keys unique, thinking that was causing the issue, but it still created two records. Fundamentally, I want to understand what is causing two records to be created; I assume it is the split filter. Please help me understand this. I will share my full config if required.
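For context, here is roughly what I believe the relevant part of my filter looks like (reconstructed from memory, so the exact field name and surrounding options are my assumption):

filter {
  # Assumed: a split filter pointed at the array field "k3"
  split {
    field => "k3"
  }
}

My understanding is that split clones the whole event once per element of the "k3" array, so a two-element array produces two separate events, and each event is then indexed as its own Elasticsearch document. Is that the correct explanation for the two records, or is something else going on?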