We have a six-node cluster holding date-partitioned Logstash data going back roughly three years. All but the last six months of indexes are closed, but we want to keep the data around for the foreseeable future. We recently upgraded the storage on three of the nodes and want to make the other three nodes dataless, but the nodes that will become dataless still hold shards of the closed indexes.
Opening each index one at a time to back it up is tedious and error-prone, and our attempts to automate the process have caused performance problems. Is there a way to consolidate these shards onto the three data nodes without bringing the indexes online? Is it safe to simply copy the data over outside of Elasticsearch?
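For context, the approach we would normally use to drain shards off nodes is cluster-level allocation filtering (node names below are placeholders for our actual node names), roughly:

```json
PUT _cluster/settings
{
  "persistent": {
    "cluster.routing.allocation.exclude._name": "node4,node5,node6"
  }
}
```

As far as we can tell, though, this only relocates shards of open indexes, which is why we're asking specifically about the closed ones.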