Sending new fields to an existing Elastic document, losing old data

I have solved it myself :smiley:

Based on this post: doc_as_upsert vs upsert properties

The fix was to turn the update into an upsert. With the default index action, every event overwrites the whole document, which is why the old fields were disappearing. With action => update plus doc_as_upsert, Elasticsearch merges the new fields into the existing document instead (and creates the document if it doesn't exist yet).
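For context, doc_as_upsert maps onto Elasticsearch's partial update API. A rough sketch of the equivalent request Logstash ends up issuing; the index name, document id, field name, and credentials below are placeholders, not values from my setup:

    # Partial update: merge "doc" into the stored document,
    # creating it if it doesn't exist (doc_as_upsert).
    curl -k -u logstash_account:some_password_here \
      -H 'Content-Type: application/json' \
      -X POST 'https://127.0.0.1:9200/some_index/_update/3640353056' \
      -d '{ "doc": { "new_field": "new value" }, "doc_as_upsert": true }'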

So my Logstash config looks something like this:

input {
  udp {
    port => 10001
    codec => json
  }
}

filter {
  # Hash the event's "id" field into a stable fingerprint, so every
  # event carrying the same id maps to the same document id.
  fingerprint {
    source => "id"
    target => "[@metadata][fingerprint]"
    method => "MURMUR3"
  }
}

output {
  elasticsearch {
    hosts => ["https://127.0.0.1:9200"]
    user => "logstash_account"
    password => "some_password_here"
    cacert => "/etc/logstash/conf.d/cacert.cer"
    ssl_certificate_verification => false
    # Use the fingerprint as the document id and merge new fields
    # into the existing document instead of overwriting it.
    document_id => "%{[@metadata][fingerprint]}"
    index => "some_index"
    doc_as_upsert => true
    action => "update"
  }
}
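To sanity-check the behaviour, you can send two events with the same id over UDP and confirm the second one adds its fields without wiping the first. The id and field names here are made up for illustration:

    # Two events, same id -> same fingerprint -> same document
    echo '{"id":"ticket-42","status":"open"}' | nc -u -w1 127.0.0.1 10001
    echo '{"id":"ticket-42","assignee":"bob"}' | nc -u -w1 127.0.0.1 10001

The resulting document should end up with both status and assignee, since the second event is merged into the first rather than replacing it.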
}