Hi - we're having an internal debate about the best way to automate the build/deploy process of our .kibana index.
Initially, some team members tested the approach of simply copying the .kibana folder in its entirety from C:\elasticsearch\elasticsearch-1.5.0\data\elasticsearch\nodes\0\indices (yes, this is on Windows) to the same location on the new server. Interestingly enough, this worked (in that all of the saved searches, visualizations, and dashboards appeared correctly), but only if it was done before any data was loaded into our main ES index. If it was done afterward, none of the searches, visualizations, or dashboards showed up properly.
Other team members are arguing that we should only be using either the Snapshot and Restore API or a utility like elasticdump to move indices from one server to another. The justification here is that companies like Elastic and TaskRabbit wouldn't have invested so much time in utilities to do this if the answer were as simple as moving a directory from one server to another.
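For context, here is a rough sketch of what those two approaches look like in practice. This is only illustrative: the repository name (kibana_backup), the location (/mnt/es_backups), and the hostnames are placeholders, and the filesystem repository path would need to be whitelisted via path.repo in elasticsearch.yml on every node.

```shell
# --- Option 1: Snapshot and Restore API (placeholder names/paths) ---

# 1. Register a shared-filesystem snapshot repository on the source cluster.
curl -XPUT 'http://localhost:9200/_snapshot/kibana_backup' -d '{
  "type": "fs",
  "settings": { "location": "/mnt/es_backups" }
}'

# 2. Snapshot only the .kibana index.
curl -XPUT 'http://localhost:9200/_snapshot/kibana_backup/snap_1?wait_for_completion=true' -d '{
  "indices": ".kibana"
}'

# 3. On the target cluster (with the same repository registered and pointing
#    at the same shared location), restore the snapshot.
curl -XPOST 'http://localhost:9200/_snapshot/kibana_backup/snap_1/_restore' -d '{
  "indices": ".kibana"
}'

# --- Option 2: elasticdump, which streams over HTTP (no shared filesystem) ---
elasticdump --input=http://old-server:9200/.kibana \
            --output=http://new-server:9200/.kibana --type=mapping
elasticdump --input=http://old-server:9200/.kibana \
            --output=http://new-server:9200/.kibana --type=data
```

Both of these go through the HTTP API, so they work regardless of what data is already in the cluster, which is part of the argument in their favor.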
Can anyone help us with a more informed opinion? We've read through a number of articles about how ES manages the underlying file structure, but haven't found anything that clearly explains what can be accomplished simply by manipulating the underlying files versus what must be done through the existing APIs or utilities.