Sure! I have a marketplace with 30k products. My main database is MySQL, mirrored asynchronously into App Search via API calls for search purposes.
Those products are constantly being created, modified, deleted, rejected, paused, resumed, marked past their due date, manually deleted because of duplications or account suspensions, and many other operational situations. Most of these changes happen through my API, but some are still manual updates made directly in the database.
Every API call I receive triggers an App Search API call to apply that same change.
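To give an idea of what the mirror does, here is a minimal sketch of one of those calls against the App Search documents endpoint. The URL, engine name, field names, and env var names are placeholders, not my real setup:

```python
import json
import os
import urllib.request

# Placeholder config; adjust to your own deployment.
APPSEARCH_URL = os.environ.get("APPSEARCH_URL", "http://localhost:3002")
ENGINE = "products"  # hypothetical engine name
API_KEY = os.environ.get("APPSEARCH_PRIVATE_KEY", "private-changeme")

def to_appsearch_doc(product):
    """Map a MySQL product row to an App Search document.

    App Search upserts on the 'id' field, so reusing the primary key
    keeps the mirror idempotent (re-sending the same row is harmless).
    """
    return {
        "id": str(product["id"]),
        "title": product["title"],
        "price": product["price"],
        "status": product["status"],
    }

def mirror_upsert(products):
    """Index (create or update) a batch of documents via the v1 documents API."""
    body = json.dumps([to_appsearch_doc(p) for p in products]).encode()
    req = urllib.request.Request(
        f"{APPSEARCH_URL}/api/as/v1/engines/{ENGINE}/documents",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

Every mutation in my API (create, modify, pause, etc.) ends up funneled through a call like this one.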
Since I have so many different situations that alter the product set, and still a lot of modifications made directly in MySQL, I want to run a full initial load every day with Logstash, just to be sure that at the beginning of each new day my product set is up to date.
Otherwise, if I miss a DELETE because of a bug, a direct modification in MySQL, or any other reason, that product will stay in my result set forever, and that can be dangerous.
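For what it's worth, the same goal (catching missed DELETEs) can also be sketched as an ID reconciliation instead of a full reload: pull all product IDs from MySQL, list what App Search has, and delete the difference. A rough stdlib-only sketch; the base URL and engine name are placeholders, and the batch size of 100 is my recollection of App Search's bulk-operation cap, so verify it against your version:

```python
import json
import urllib.request

def find_orphans(search_ids, db_ids):
    """IDs present in App Search but no longer in MySQL; these are
    exactly the documents a missed DELETE would leave behind forever."""
    return sorted(set(search_ids) - set(db_ids))

def chunked(ids, size=100):
    """Split IDs into batches, since bulk document operations are capped
    per request (100 in the versions I have used)."""
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

def delete_orphans(base_url, engine, api_key, orphan_ids):
    """Issue one DELETE per batch against the v1 documents endpoint."""
    for batch in chunked(orphan_ids):
        req = urllib.request.Request(
            f"{base_url}/api/as/v1/engines/{engine}/documents",
            data=json.dumps(batch).encode(),
            headers={
                "Authorization": f"Bearer {api_key}",
                "Content-Type": "application/json",
            },
            method="DELETE",
        )
        urllib.request.urlopen(req, timeout=10)
```

This avoids rebuilding the whole engine, but it still depends on being able to enumerate every document ID on both sides, which is why the daily full load felt simpler to me.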
I tried using Logstash for the ETL instead of App Search API calls, but its output plugin only supports indexing (POST); there are no DELETE or PATCH requests, so it falls short for my needs.
A couple of minutes of downtime isn't a problem, since I have a failover search method that queries MySQL directly while the initial load is running.
Deleting an engine and recreating it wouldn't be a problem either. I don't have any curations or synonyms, and the boosts and weights are specified in the JSON requests. The problem is that after deleting the engine and then recreating the engine and schema, a couple of minutes later every POST request begins returning error 500, without any clue in the API logs. The only solution is to destroy the Elasticsearch and App Search containers (I'm using Docker) and recreate them.
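For clarity, the delete-and-recreate flow that triggers the 500s is roughly the following sketch. The engine name and schema fields are placeholders, and the paths are the v1 engines API as I understand it:

```python
import json
import urllib.request

def build_call(base_url, method, path, payload=None):
    """Build (url, method, body) for an App Search v1 call.
    Kept pure so it is easy to inspect without a running server."""
    body = None if payload is None else json.dumps(payload).encode()
    return (f"{base_url}/api/as/v1{path}", method, body)

def api(base_url, api_key, method, path, payload=None):
    """Execute one App Search API call and return the parsed response."""
    url, method, body = build_call(base_url, method, path, payload)
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method=method,
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

def recreate_engine(base_url, api_key, engine, schema):
    """Drop and rebuild the engine, then push the field-type schema.
    After this, re-indexing the full product set starts returning 500s."""
    api(base_url, api_key, "DELETE", f"/engines/{engine}")
    api(base_url, api_key, "POST", "/engines", {"name": engine})
    api(base_url, api_key, "POST", f"/engines/{engine}/schema", schema)
```

The POSTs that start failing afterwards are the plain document-index calls, so the breakage seems to happen somewhere between the engine deletion and the re-indexing, not in my payloads.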
I hope I was clear enough.