Adding a new field to existing documents in an index from Logstash is recreating the index


My setup uses Logstash 7.2.0 to fetch records from a database and push them to Elasticsearch.

I have a requirement to add a new field to existing documents in an index, and also to index new documents with the new field, via Logstash.

I added the field to the query in the Logstash JDBC input plugin and ran the pipeline.

It seems that the existing documents are getting deleted and recreated. I observed this by fetching the record count every 30 seconds.

Is this the expected behavior, or am I missing something? Please help.
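For reference, when the Elasticsearch output is given a stable `document_id`, each pipeline run re-indexes (overwrites) the matching document rather than creating a duplicate, which can look like delete-and-recreate because the document version increments. A minimal sketch of that output block (the index name and the `id` field from the JDBC query are placeholders):

```
output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "my_index"    # placeholder index name
    document_id => "%{id}"       # primary key column returned by the JDBC query
  }
}
```

Without `document_id`, Elasticsearch assigns a random ID on every run, so each run inserts new copies instead of updating the existing documents.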


Did you ever find any additional information on this?

I have a similar issue. I can update the document manually with an Update API call, but using the Logstash Elasticsearch output has the same issue you listed here. The new fields are mapped correctly, but the values are not displayed. I have also tried the "update" action.
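For comparison, the manual update that does work is the Elasticsearch Update API. A sketch in Kibana Dev Tools syntax (index name, document ID, and field name are placeholders):

```
POST my_index/_update/1
{
  "doc": {
    "new_field": "some value"
  }
}
```

This merges `new_field` into the existing document without touching its other fields, which is the behavior the Logstash output should reproduce once it is configured with the matching document ID.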

Found the issue with this on my side.

The additional fields were being added correctly using the aggregate filter, but I had a mutate filter that was removing the fields after formatting. Once that was resolved, the records and new fields were updated as normal.

Both the default index action and the update action worked with the Elasticsearch output plugin.
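For anyone else landing here, the update-action variant can be sketched like this (hosts, index name, and the `id` field are placeholders; `doc_as_upsert` is optional and makes missing documents get inserted rather than failing):

```
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "my_index"
    document_id   => "%{id}"
    action        => "update"
    doc_as_upsert => true   # insert the document if it does not exist yet
  }
}
```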

That is great! I am just letting the index recreation happen; I could not find an alternative.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.