My architecture is Kafka → Logstash → Elasticsearch. I wrote some Ruby code in Logstash to calculate a KPI, and everything was working fine, but after a few days that KPI was reset to 0, and only that KPI. I had also created some scripted fields in Kibana; I somehow misconfigured them and immediately saw data loss in my saved searches (not all of them, but some). I deleted those scripted fields and recreated them properly, and they worked, but after some time the same issue came back.
This is my Ruby code at the Logstash level:
    mutate {
      # every event gets the same task id so they all share one aggregate map
      add_field => { "[@metadata][task]" => "constant" }
    }
    aggregate {
      task_id => "%{[@metadata][task]}"
      code => '
        # running total kept in the in-memory aggregate map
        map["total"] ||= 0
        t = event.get("[payload][type]")
        s = event.get("[payload][status]")
        if t == "topup" && s == "success"
          map["total"] += event.get("[payload][amount]")
        elsif t == "cashout" && s == "success"
          map["total"] -= event.get("[payload][amount]")
        end
        # write the current running total onto the event
        event.set("total", map["total"])
      '
    }
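For context, the incoming events look roughly like this (the field names are taken from the code above; the values are made up for illustration):

    {
      "payload": {
        "type": "topup",
        "status": "success",
        "amount": 50
      }
    }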
The scripted fields were plain doc["fieldname"] lookups; I only used them to rename some fields (a sketch of one is below).
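As a minimal sketch, assuming one of the original fields was called payload.amount (the field name here is just an example, not my real mapping), the scripted field was nothing more than:

    doc['payload.amount'].value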