I have recently inherited an ELK stack 1.3.2/1.4.2/3.01 which will be moved to new platforms and upgraded at the same time. Is there any documentation around upgrading something this old?
Also, I'd like to remove as much old data as possible, so is there a setting in these old versions of the software that will allow me to 'age out' some data?
I've tried to find the documentation for these early versions, but it seems to keep defaulting to the latest version, so I'm not sure whether the information I'm reading is correct.
I would say the easiest approach would be to spin up an entirely new cluster and migrate the data with a reindex. You could use Logstash to reindex your data from the old cluster to the new cluster.
But a lot of things have changed since those old versions: the mappings changed, you can't have multiple types in the same index anymore, and much more besides.
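As a rough illustration, a Logstash reindex pipeline reads from one cluster with the elasticsearch input and writes to another with the elasticsearch output. This is only a sketch — the hostnames and index pattern are placeholders, and the exact option names should be checked against the plugin documentation for whichever Logstash version you use:

```
# Sketch: reindex from the old cluster into the new one.
# "old-cluster" / "new-cluster" are placeholder hostnames.
input {
  elasticsearch {
    hosts   => ["old-cluster:9200"]
    index   => "logstash-*"
    docinfo => true            # copy _index/_id into @metadata
  }
}
output {
  elasticsearch {
    hosts       => ["new-cluster:9200"]
    index       => "%{[@metadata][_index]}"
    document_id => "%{[@metadata][_id]}"
  }
}
```

With `docinfo => true` the original index name and document id are preserved, so documents land in equivalently named indices on the new cluster.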
You will need to check the breaking changes for every major version since your current version to see what affects you.
On the documentation page you can select a specific version to read; for example, these are the breaking changes for version 2.0.
Thanks for your response Leandro. I suspected that this would not be simple and was thinking that we would need to build a new instance and just let the old one fade away as the data aged out. We only have a life of 2 or 3 months on the data anyway.
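If the data lives in daily time-based indices (the usual `logstash-YYYY.MM.DD` naming), aging out data is just a matter of deleting whole indices older than the retention window — Elasticsearch Curator is the usual tool for this, and it supports the 1.x line. The selection logic can be sketched as follows; the index names and the 90-day window here are illustrative, not from the thread:

```python
from datetime import datetime, timedelta

def indices_to_delete(index_names, retention_days=90, today=None):
    """Return the daily logstash-YYYY.MM.DD indices older than the retention window."""
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=retention_days)
    old = []
    for name in index_names:
        try:
            day = datetime.strptime(name, "logstash-%Y.%m.%d")
        except ValueError:
            continue  # skip indices that don't match the daily pattern
        if day < cutoff:
            old.append(name)
    return old
```

Each returned name would then be removed with a `DELETE /<index>` request (or handed to Curator), which is far cheaper than deleting individual documents.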