Hi,
I am running the docker-elk stack 7.14.1 and noticed that in the Stack Monitoring tab, Elasticsearch reports that it only has a 50 GB disk.
The host running the container has 700 GB free. How can I expand this container so it can use all the available free space on the host?
Because of the size limitation, the disk fills up within days, which is a problem because I cannot store more data even though the host has free space available.
At the moment I am using the default docker-compose file with the volumes it defines. The data persists, but is limited to 50 GB.
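For reference, the relevant part of my docker-compose file looks roughly like this (a sketch from memory, so the names may differ slightly from the actual docker-elk file). The data lives in a named Docker volume, so its size is bounded by the partition that backs the Docker data root:

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.14.1
    volumes:
      # Named volume stored under /var/lib/docker on the host
      - elasticsearch:/usr/share/elasticsearch/data

volumes:
  elasticsearch:
```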
@stephenb is correct. The one thing I would add is that the Elasticsearch Docker image runs as user ID and group ID 1000, so make sure that this user owns the path on the host.
Then bind-mount this path into the Elasticsearch container. The following example is from one of our docker-compose files (you can also see how we mounted the path to the certificates)...
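Since the snippet did not come through above, here is a minimal sketch of that shape. The host paths /mnt/esdata and /mnt/escerts are placeholders for illustration, not the paths from our actual file:

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.14.1
    volumes:
      # Data directory on the large host disk; must be owned by uid:gid 1000:1000
      # (e.g. chown -R 1000:1000 /mnt/esdata), since the image runs as that user.
      - /mnt/esdata:/usr/share/elasticsearch/data
      # Certificates mounted read-only into the config directory
      - /mnt/escerts:/usr/share/elasticsearch/config/certs:ro
```

With a bind mount like this, Elasticsearch sees the full capacity of the host filesystem that backs the path, rather than whatever partition holds the Docker data root.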