Reindexing a big index without downtime

Hi.

I have a really big index with 100 million documents.
I want to add some new fields, change existing ones, and delete redundant ones by applying an explicit mapping to a new index. I will also use an alias to switch between the indices:

existing_index -> ALIAS   (current)
new_index -> ALIAS   (after reindexing)
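
For reference, this is a minimal sketch of that plan using the official elasticsearch-py client (8.x-style keyword arguments assumed; the cluster URL and the field names in the mapping are placeholders, not my real ones):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # adjust to your cluster

# Create the new index with the explicit mapping up front
# (the field definitions below are only placeholders).
es.indices.create(
    index="new_index",
    mappings={
        "properties": {
            "new_field": {"type": "keyword"},
            "changed_field": {"type": "date"},
        }
    },
)

# Atomically move the alias from the old index to the new one,
# so readers of ALIAS never see a gap during the switch.
es.indices.update_aliases(
    actions=[
        {"remove": {"index": "existing_index", "alias": "ALIAS"}},
        {"add": {"index": "new_index", "alias": "ALIAS"}},
    ]
)
```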

But the problem is that I am pulling new documents from a database continuously. With this approach I have to stop Logstash from writing to the existing index, reindex it into the new index, and then switch the alias.
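
For context, the stop-then-copy step I mean looks roughly like this (same client and placeholder names as in the sketch above; `wait_for_completion=False` makes Elasticsearch run the copy as a background task):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # adjust to your cluster

# With Logstash stopped, copy everything from the old index into
# the new one; the call returns a task ID that can be polled
# instead of blocking until all documents are copied.
task = es.reindex(
    source={"index": "existing_index"},
    dest={"index": "new_index"},
    wait_for_completion=False,
)
print(task["task"])
```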

Is there any better way to do it without stopping Logstash?

Thanks.
