We currently have the problem that, for security reasons, we have to run ELK on premises without a connection to the internet.
We now want to migrate the data of multiple installations into one "global" one.
The idea would also be to do this on a regular basis, but only with the delta of the data since the last migration.
Is there a tool or concept for this kind of problem?
If you are able to mount an NFS directory on these nodes (maybe you have an internal one that isn't internet accessible?), then you could snapshot the data to it, register the same location as a repository on the global cluster, and restore from there. Snapshots are incremental, so repeated runs only copy the segments added since the last snapshot — which covers your delta requirement.
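In case it helps, here is a rough sketch of that flow. The repository name, NFS path, and host are placeholders, and the path also has to be whitelisted via `path.repo` in `elasticsearch.yml` on every node:

```shell
# Register the NFS-mounted path as a shared filesystem ("fs") repository.
# "my_nfs_repo" and /mnt/nfs/es_backups are placeholders for your setup.
curl -XPUT 'http://localhost:9200/_snapshot/my_nfs_repo' \
  -H 'Content-Type: application/json' -d '{
  "type": "fs",
  "settings": { "location": "/mnt/nfs/es_backups" }
}'

# Take a snapshot. Because snapshots are incremental, running this again
# later (with a new snapshot name) only copies data not already in the repo.
curl -XPUT 'http://localhost:9200/_snapshot/my_nfs_repo/snapshot_1?wait_for_completion=true'

# On the global cluster, after registering the same repository location,
# restore the snapshot:
curl -XPOST 'http://localhost:9200/_snapshot/my_nfs_repo/snapshot_1/_restore'
```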
If the global cluster is within your firewalls, you could also have the nodes join the global cluster, and then use allocation filtering to exclude the nodes you want to decommission, which will cause ES to move the data off of them onto your new cluster.
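If you go the allocation-filtering route, it boils down to one cluster settings call. The IP below is a placeholder for the node you want to drain; you could also filter on `_name` or `_host`:

```shell
# Exclude a node by IP; Elasticsearch will relocate all shards off it
# onto the remaining (global) nodes. 10.0.0.1 is a placeholder.
curl -XPUT 'http://localhost:9200/_cluster/settings' \
  -H 'Content-Type: application/json' -d '{
  "transient": {
    "cluster.routing.allocation.exclude._ip": "10.0.0.1"
  }
}'
```

Once the excluded node holds no more shards, it can be shut down safely.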
Thanks for the reply.
Unfortunately, no automated connection whatsoever is possible yet; we wanted to export the data and transfer it manually.
So for that scenario you don't see a possibility?