We recently had two servers go down due to hardware failure. When we added them back to the environment, the indices stored on those nodes did not automatically sync with the replicas on the other nodes, which created dangling indices and left us with two indices of the same name. We chose to delete the indices to bring the cluster back to green, but now we need to restore the data from compressed files stored on our SAN. The indices were created with version 6.6.1; after the deletion we upgraded to 7.9. We usually ingest our data through connectors (sensors), and I have never done a restore like this before. This is a closed network, so I can't provide screenshots. We also have not incorporated snapshots, which I plan to do after this incident. Any suggestions? Any help is appreciated.
Unfortunately there is not much you can do; the only officially supported way to back up and restore Elasticsearch indices is the snapshot and restore API.
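For when you do set up snapshots later: registering a shared-filesystem repository and taking and restoring a snapshot looks roughly like this. This is only a sketch; the repository name `my_backup` and the location are placeholders, and the location must be listed under `path.repo` in `elasticsearch.yml` on every node.

```console
PUT _snapshot/my_backup
{
  "type": "fs",
  "settings": { "location": "/mount/backups/my_backup" }
}

PUT _snapshot/my_backup/snapshot_1?wait_for_completion=true

POST _snapshot/my_backup/snapshot_1/_restore
```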
What you can try, though with little hope that it will work, is to unzip the data into a different path, create an entirely new cluster running version 6.6.1, and configure that cluster to point to this data directory. The path needs to have the same structure as an Elasticsearch data path.
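A minimal sketch of what that temporary cluster's config could look like, assuming the unzipped data ended up at `/restore/data` (a placeholder path):

```yaml
# elasticsearch.yml on the throwaway 6.6.1 cluster
cluster.name: restore-cluster
# must contain the original data-path layout (nodes/0/indices/...)
path.data: /restore/data
```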
If this works, you will have access to the data and can use Logstash or another tool to reindex it into your current cluster. But there is no guarantee it will work, and there is no documentation about it from Elastic, since it is not officially supported.
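If the temporary 6.6.1 cluster does come up, a minimal Logstash pipeline for pulling documents out of it and writing them into the current cluster might look like this. The hosts and index name are placeholders, and the `@metadata` field layout shown here is what Logstash 7.x's elasticsearch input produces with `docinfo => true`; check your Logstash version's defaults before relying on it.

```conf
input {
  elasticsearch {
    hosts   => ["http://old-cluster:9200"]
    index   => "my-index"
    query   => '{ "query": { "match_all": {} } }'
    docinfo => true   # copy _index and _id into @metadata
  }
}
output {
  elasticsearch {
    hosts       => ["http://new-cluster:9200"]
    index       => "%{[@metadata][_index]}"
    document_id => "%{[@metadata][_id]}"
  }
}
```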
Thank you for the feedback. I plan to see if I can unzip the files and convert them to JSON. Hopefully that will allow me to use the API.
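If you do get the data back out as plain JSON documents, note that the `_bulk` API expects newline-delimited action/source pairs rather than a single JSON array. A small sketch of building that payload in Python (the index name and documents below are made up for illustration):

```python
import json

def to_bulk_ndjson(index, docs):
    """Build a _bulk request body: one action line plus one source line
    per document, newline-delimited, with a trailing newline."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

# Hypothetical documents recovered from the compressed files
docs = [{"sensor": "a1", "value": 42}, {"sensor": "b2", "value": 7}]
payload = to_bulk_ndjson("restored-index", docs)
print(payload)
```

You would then POST the resulting body to `/_bulk` with the header `Content-Type: application/x-ndjson`.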
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.