Reindex data on old ES version using Logstash

Hey there,
I am trying to reindex data from an old ES version to another old ES version using Logstash with the elasticsearch input.
However, I have some doubts. If I schedule the action, will the second execution update the documents in the target index? Will it start from the beginning every time?


If you use a schedule it will re-run the same query. If the query does not restrict the data it fetches then it will fetch the same data again and the same documents will be indexed. There is no functionality like a sincedb for the elasticsearch input.
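A minimal pipeline for this kind of reindex might look like the sketch below. The host names and index names are placeholders, not from this thread, and the `docinfo_target` layout shown (`[@metadata][_id]`) matches older plugin versions; newer versions of the input default to a different metadata path, so check your plugin's docs:

```
input {
  elasticsearch {
    hosts    => ["http://old-cluster:9200"]   # hypothetical source host
    index    => "source-index"
    query    => '{ "query": { "match_all": {} } }'
    schedule => "*/30 * * * *"                # cron syntax; re-runs the same query each time
    docinfo  => true                          # expose _index/_type/_id in [@metadata]
  }
}

output {
  elasticsearch {
    hosts       => ["http://new-cluster:9200"]  # hypothetical target host
    index       => "target-index"
    document_id => "%{[@metadata][_id]}"        # reuse the source _id so re-runs update rather than duplicate
  }
}
```

Because the output reuses the source `_id`, a scheduled re-run that fetches the same documents overwrites them in place instead of creating duplicates.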

Ok, so I could probably filter on a timestamp field? Or is there another way to achieve this?

Just another doubt: if I don't specify any query, will the first run execute a match_all query?
And if the source index is updated with new docs in the meantime, will the Logstash input pick them up or not? I mean, does the first run work as a "snapshot" (not literally, of course)?

The query will fetch whatever data matches it at the time it executes. It does not fetch documents that are added or updated after it executes.
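If the source documents carry a timestamp field, one common workaround is to restrict the scheduled query to a recent time window. This is a sketch, assuming a field named `@timestamp` and a window slightly larger than the schedule interval; the overlap is harmless if the output reuses the source `_id` as `document_id`, since re-fetched documents just overwrite themselves:

```
input {
  elasticsearch {
    hosts    => ["http://old-cluster:9200"]   # hypothetical source host
    index    => "source-index"
    schedule => "*/5 * * * *"                 # run every 5 minutes
    docinfo  => true
    # fetch only documents from the last 10 minutes; the overlap with the
    # previous run is deduplicated by reusing the _id in the output
    query    => '{ "query": { "range": { "@timestamp": { "gte": "now-10m" } } } }'
  }
}
```

Note this only approximates incremental behavior: if Logstash is down for longer than the window, documents in the gap are missed, since there is no sincedb-style bookkeeping for this input.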


Using this approach, I saw that I cannot export the _ttl value from the source documents to the destination indices.
Is that true?
In the source index's mapping I set a default _ttl value, so I would like to export it; otherwise my destination documents will have their _ttl value reset.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.