So I'm using the Elasticsearch Docker image to develop locally. Works great.
Now, what I'd like to do as part of a docker-compose setup is:
Create an instance of Elasticsearch via the Docker image
Create an index
Import data using the Bulk API (e.g. a JSON file in the source control repository)
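For the import step, the Bulk API expects newline-delimited JSON: an action line before each document, with a trailing newline at the end of the file. A minimal sketch of what that file and the curl call could look like (the file name `seed-data.ndjson`, the index name `my-index`, and the `localhost:9200` port mapping are all assumptions for illustration):

```shell
# Hypothetical bulk payload checked into the repo: one action line per document.
cat > seed-data.ndjson <<'EOF'
{"index":{"_index":"my-index","_id":"1"}}
{"title":"First doc"}
{"index":{"_index":"my-index","_id":"2"}}
{"title":"Second doc"}
EOF

# Once the cluster is reachable, POST the file to the _bulk endpoint:
# curl -s -H "Content-Type: application/x-ndjson" \
#      -XPOST "http://localhost:9200/_bulk" --data-binary @seed-data.ndjson
```

Note that `Content-Type: application/x-ndjson` and the `--data-binary` flag (which preserves the newlines) are both required by the Bulk API.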
Initially I was thinking I could have a Dockerfile that started from the Elasticsearch base image and then somehow ran curl commands to create the index and add data, but the issue is that the container isn't 'up' yet at build time.
Ideally I'd like to run a command like this:
docker-compose up my-elastic-search-db
Then everything is up and ready to go.
How do people solve this? I've seen a few posts, a few years old now, of people running scripts that 'wait' for the cluster to be up... so I'm just checking whether this is still the way, or whether Elastic has made it easier.
Can you please elaborate on how a healthcheck can help?
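For context, a Compose healthcheck lets other services wait until Elasticsearch actually responds, rather than merely until its container has started. A minimal sketch, assuming a single-node dev setup with security disabled (the service name, image tag, and timings are illustrative, not prescriptive):

```yaml
services:
  my-elastic-search-db:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.4
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false   # dev only
    ports:
      - "9200:9200"
    healthcheck:
      # Container reports "healthy" once the cluster answers at least yellow.
      test: ["CMD-SHELL", "curl -s -f http://localhost:9200/_cluster/health?wait_for_status=yellow || exit 1"]
      interval: 10s
      timeout: 10s
      retries: 12
```

The check runs inside the container (the official Elasticsearch images ship with curl), so other services can gate on it via `depends_on` with `condition: service_healthy`.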
When and where would my 'curl' run to insert the data? It can't be in a custom Dockerfile for the Elasticsearch image, right? It would need to run after the healthcheck passes...
So... would I just have another 'service' in Docker Compose that `depends_on` the Elasticsearch service, where this 'service' is really just a script that uses curl to insert the data?
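That is the common pattern: a short-lived sidecar service that starts only once the healthcheck reports healthy, runs its curl commands, and then exits. A sketch under the same assumptions as above (the `curlimages/curl` image, the index name, and the mounted file name are all hypothetical; any small image with curl and a shell would do):

```yaml
services:
  seed-data:
    image: curlimages/curl:8.8.0   # any image with curl and sh works
    depends_on:
      my-elastic-search-db:
        condition: service_healthy   # start only after the healthcheck passes
    volumes:
      - ./seed-data.ndjson:/seed-data.ndjson:ro
    entrypoint: ["/bin/sh", "-c"]
    command:
      - |
        # Create the index, then bulk-load the data, then exit.
        curl -s -XPUT http://my-elastic-search-db:9200/my-index
        curl -s -H 'Content-Type: application/x-ndjson' \
             -XPOST http://my-elastic-search-db:9200/_bulk \
             --data-binary @/seed-data.ndjson
```

With this in place, a single `docker-compose up` brings up Elasticsearch, waits for it to be healthy, seeds the data, and leaves the database container running while the seed container exits.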