Prevent the loss of existing fields when using document_id in the elasticsearch output plugin

I need to merge multiple logs into a common document according to a specific field value,
so I set document_id in the elasticsearch output plugin.

But instead of adding fields to the document after parsing a log, it deletes the existing fields (created from earlier logs) and keeps only the new ones, so I lose information.
My fields don't even have the same names, so overwriting shouldn't be possible.

How can I prevent the existing fields from being deleted?

Use the doc_as_upsert option?

Yes, like this:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
    doc_as_upsert => true
    document_id => "doc-%{[id_essai]}"
  }
}
But there was no difference between "doc_as_upsert => true" and "doc_as_upsert => false".
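As far as I know, "doc_as_upsert" only takes effect together with "action => update"; the default action is "index", which replaces the whole document and would explain why the option seemed to make no difference. A sketch of the config with that change (same hosts, index, and id field as above; worth double-checking against the plugin docs):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
    # perform a partial update instead of replacing the document
    action => "update"
    # create the document if it does not exist yet
    doc_as_upsert => true
    document_id => "doc-%{[id_essai]}"
  }
}

With "action => update", only the fields present in the current event should be merged into the stored document, leaving fields from earlier logs intact.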

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.