We are using the Logstash JDBC input plugin to fetch data from a MySQL database and push it into an Elasticsearch index. A job scheduled to run every minute handles delta imports: Logstash queries the database for rows that changed in the last minute.
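For context, our delta-import pipeline looks roughly like the sketch below (connection details, table, and column names are placeholders; we track changes with an `updated_at` timestamp column and the plugin's built-in `:sql_last_value`):

```conf
input {
  jdbc {
    jdbc_driver_class      => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"   # placeholder DSN
    jdbc_user              => "user"
    jdbc_password          => "password"
    schedule               => "* * * * *"                          # every minute
    use_column_value       => true
    tracking_column        => "updated_at"                         # our change-tracking column
    tracking_column_type   => "timestamp"
    statement              => "SELECT * FROM products WHERE updated_at > :sql_last_value"
  }
}
output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "products"                                      # current live index
    document_id => "%{id}"
  }
}
```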
But occasionally (every day or two) we would like to reimport all of the data into a new index and then switch our alias from the current index to the new one.
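The alias switch itself is straightforward with the Elasticsearch `_aliases` API, which applies both actions atomically (index names here are placeholders for illustration):

```
POST /_aliases
{
  "actions": [
    { "remove": { "index": "products_v1", "alias": "products" } },
    { "add":    { "index": "products_v2", "alias": "products" } }
  ]
}
```

The part we are unsure about is when it is safe to send this request.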
How can we do this with Logstash? Specifically, how can we tell that the full import has finished and the new index is complete before switching the alias?