"doc_as_upsert" overwrites logs instead of appending the logs

Hello Elastic,

I have a Filebeat agent that harvests logs. It sends them to Logstash on the same server, which then forwards them to Elastic Cloud; on Elastic Cloud, the field "message" holds the content of the log line.

After a document has been created, if the log file is updated I would like to append the new lines to the "message" field of the existing document. Unfortunately, the update overwrites the previous content of "message".

When the log is updated, Filebeat sends only the new lines to Logstash, not the whole file, which seems to be the normal behaviour.

I have tried a lot of combinations, but I can't get it to work: every time, the "message" field is overwritten by the updated message from the log.
Could you please help me figure it out?

My Logstash config:

input {
  beats {
    port => 5044
    add_field => {
      "[@metadata][target_index]" => "index-[name]-log-%{+YYYY-MM-dd}"
      "[@metadata][target_document_id]" => "%{[log][file][path]}"
    }
  }
}

output {
  elasticsearch {
    action => "update"
    hosts => ["https://[host].northeurope.azure.elastic-cloud.com:*"]
    user => "elastic"
    password => "***"
    proxy => "***"
    index => "%{[@metadata][target_index]}"
    doc_as_upsert => true
    document_id => "%{[@metadata][target_document_id]}"
  }
}

Versions:
Elasticsearch 7.17.2
Logstash 8.1.3
Filebeat 8.1.3

Thanks

doc_as_upsert only has an effect when there is no existing document and the update action has to index a new one.
Without doc_as_upsert, a separate upsert field is used to index the new document.
With doc_as_upsert, the doc field (which is normally used to update an existing document) is also used to index the new document.
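Roughly, at the Elasticsearch API level the two cases look like this (index name, document id, and field values are illustrative):

```
# Without doc_as_upsert: the "upsert" block is indexed when the document does not exist
POST my-index/_update/my-id
{
  "doc":    { "message": "new line" },
  "upsert": { "message": "initial line" }
}

# With doc_as_upsert: the "doc" block itself is indexed when the document does not exist
POST my-index/_update/my-id
{
  "doc": { "message": "new line" },
  "doc_as_upsert": true
}
```

In both cases, when the document already exists, "doc" replaces the fields it names — it never appends to them.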

It is not a function for appending a new value to an existing field. I have no good way to achieve that in the output itself; you would probably need the elasticsearch filter plugin to fetch the existing document and a ruby filter plugin to merge the new value into the existing content.
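A rough, untested sketch of that filter-based approach, reusing the host and metadata fields from the config above (the elasticsearch filter's options vary between plugin versions, and since the document id is a file path, the `_id` lookup query would likely need escaping):

```
filter {
  # Hypothetical: look up the previously indexed document by its _id
  elasticsearch {
    hosts => ["https://[host].northeurope.azure.elastic-cloud.com:*"]
    index => "%{[@metadata][target_index]}"
    query => "_id:%{[@metadata][target_document_id]}"
    fields => { "message" => "[@metadata][old_message]" }
  }
  # If a previous document was found, prepend its message to the new lines
  if [@metadata][old_message] {
    ruby {
      code => 'event.set("message", event.get("[@metadata][old_message]") + "\n" + event.get("message"))'
    }
  }
}
```

Note that this does one extra query per event, so it would be slow and racy for high-volume logs.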

Why don't you index every new message as a new document and aggregate them at query time if necessary? I would recommend that approach.
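With one document per log line, you can reassemble a file's full log at query time by filtering on the file path and sorting by timestamp (index pattern and path value are illustrative):

```
GET index-*-log-*/_search
{
  "query": { "term": { "log.file.path": "/var/log/app.log" } },
  "sort": [ { "@timestamp": "asc" } ],
  "size": 1000
}
```

This also keeps the default Filebeat/Logstash pipeline intact: no custom document ids, no update actions, and no per-event lookups.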
