How can I permanently save Elasticsearch data in Azure while using Docker, ensuring that the data remains intact even if the container is stopped or restarted?
Hello, and welcome! In addition to Stephen's notes below, please refer to Snapshot and restore | Elasticsearch Guide [8.17] | Elastic to back up your data.
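As a rough sketch of what registering an Azure snapshot repository looks like (assuming the Azure repository integration is available and your storage account credentials are already in the Elasticsearch keystore as `azure.client.default.account` and `azure.client.default.key`; `my_azure_repo` and `es-snapshots` are placeholder names):

```
# Register an Azure blob container as a snapshot repository
PUT _snapshot/my_azure_repo
{
  "type": "azure",
  "settings": {
    "container": "es-snapshots"
  }
}

# Take a snapshot of the whole cluster into that repository
PUT _snapshot/my_azure_repo/snapshot_1?wait_for_completion=true
```

Snapshots live in Azure Blob Storage, entirely outside the container, so they survive even if the Docker volume itself is lost.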
Hi @Constellation2025, welcome to the community.
I recommend reading the production notes here carefully, especially the part about binding data volumes:
Always bind data volumes
You should use a volume bound on /usr/share/elasticsearch/data for the following reasons:

- The data of your Elasticsearch node won't be lost if the container is killed
- Elasticsearch is I/O sensitive, and the Docker storage driver is not ideal for fast I/O
- It allows the use of advanced Docker volume plugins
If you don't know about Docker volumes, you should read up on them.
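For example, a minimal compose sketch along those lines (the image tag and the single-node setting are assumptions for illustration, not taken from this thread):

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.17.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
    volumes:
      # Named volume bound on the data path, so the data
      # survives the container being stopped or removed
      - es_data:/usr/share/elasticsearch/data

volumes:
  # The named volume must also be declared at the top level,
  # otherwise compose refuses to start the service
  es_data:
```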
I am using volumes like `- es_data:/usr/share/elasticsearch/data` but am still losing data on every restart.
Then you are not setting up the volumes correctly, or something is cleaning the volumes up...
Are you following these instructions?
I run this over and over with `docker compose down` and then `docker compose up`, and the data remains.
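For reference, a sketch of that cycle, plus the one variant that does delete data (`<project>` stands for your compose project name and is a placeholder):

```bash
docker compose down      # removes the containers but keeps named volumes
docker compose up -d     # the data in es_data is still there

# This, by contrast, deletes named volumes and the Elasticsearch data with them:
# docker compose down -v

# Verify the volume survives between restarts:
docker volume inspect <project>_es_data
```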
You will need to share your entire compose file and all the commands you are running, what OS you are on, etc.
Not sure what is happening when it reaches max storage on restart. Why does it remove anything from Elasticsearch in Azure? I have data to store, more than 275505879.