We have telemetry output coming into Logstash via Kafka, which we split at three levels to extract the required data. The filter block's output looks correct, but when the output stage sends the data to Elasticsearch, a single split event's values are repeated across all of the output messages. An example is given below:
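For context, the pipeline follows this general shape. This is a simplified sketch, not our exact config; the topic, field names, host, and index below are placeholders:

```
input {
  kafka {
    topics => ["telemetry"]    # placeholder topic name
    codec  => "json"
  }
}

filter {
  # Three nested splits, one per level (field names are placeholders)
  split { field => "[level1]" }
  split { field => "[level1][level2]" }
  split { field => "[level1][level2][level3]" }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]              # placeholder host
    index => "telemetry-%{+YYYY.MM.dd}"      # placeholder index pattern
  }
}
```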
Filter output after splitting to three levels:
{ "name": "a", "value": 10 },
{ "name": "b", "value": 20 },
{ "name": "c", "value": 30 }
Output messages sent to Elasticsearch:
{ "name": "a", "value": 10 },
{ "name": "a", "value": 10 },
{ "name": "a", "value": 10 }
I don't see any exceptions in the debug logs.
Please let me know if any details are needed.
Regards,
Sukumar