So I'm using the Elasticsearch Docker image to develop locally, and it works great.
Now, as part of a docker-compose script, what I'd like to do is:
- Create an Elasticsearch instance from the Docker image
- Create an index
- Import data using the Bulk API (e.g. from a JSON file kept in the source control repository)
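For context, this is roughly the kind of seed file I have in mind for the Bulk API step — a sketch with a hypothetical `products` index and made-up documents. The Bulk API expects newline-delimited JSON: an action line followed by a document line for each record, and the payload must end with a newline.

```shell
# Hypothetical seed file: alternating action/document lines (NDJSON).
cat > bulk.ndjson <<'EOF'
{"index":{"_index":"products","_id":"1"}}
{"name":"widget","price":9.99}
{"index":{"_index":"products","_id":"2"}}
{"name":"gadget","price":19.99}
EOF

# Two action/document pairs -> four lines, each newline-terminated.
wc -l < bulk.ndjson
```

Once the cluster is reachable, this file would be posted with something like `curl -H 'Content-Type: application/x-ndjson' -XPOST localhost:9200/_bulk --data-binary @bulk.ndjson`.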
Initially I was thinking I could have a Dockerfile that starts from the Elasticsearch base image and then somehow uses curl commands to create the index and add the data, but the problem is that the container isn't 'up' yet at build time.
Ideally, I'd like to run a single command like this:
docker-compose up my-elastic-search-db
Then everything is up and ready to go.
How do people solve this? I've seen a few posts from a few years ago where people ran scripts that 'wait' for the cluster to be up, so I'm just checking whether that's still the way to do it, or whether Elastic have made it easier.
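To make the question concrete, here's roughly what I'm imagining — a sketch, not something I've got working. It assumes a compose file format that supports `depends_on` with `condition: service_healthy` (2.4 or the newer Compose Spec), and the version tag, index name, and file paths are placeholders:

```yaml
version: "2.4"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0  # placeholder tag
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
    healthcheck:
      # Poll until the HTTP endpoint responds
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 5s
      timeout: 5s
      retries: 20

  seed-data:
    image: curlimages/curl:latest
    depends_on:
      elasticsearch:
        condition: service_healthy   # only start once ES passes its healthcheck
    volumes:
      - ./seed/bulk.ndjson:/bulk.ndjson:ro   # hypothetical seed file path
    command: >
      curl -sf -H 'Content-Type: application/x-ndjson'
      -XPOST http://elasticsearch:9200/_bulk
      --data-binary @/bulk.ndjson
```

Is something along these lines the current best practice, or is there a cleaner mechanism I'm missing?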