We are developing our project using Docker / docker-compose, and Elasticsearch and Kibana are part of that setup. We'd like to ensure that each time we start the project with docker-compose, our saved index templates, saved searches, visualizations, and dashboards are there for us.
Can you please provide an example of how this can be achieved?
Unfortunately that's hardly helpful. We're talking about a development environment, where persistent storage is the least of our worries.
We are trying to find a solution to a scenario where (for example):

- you do a clean checkout of the code from the repository to your workstation
- you have Elasticsearch and Kibana inside your docker-compose.yml
- you run docker-compose up
- you immediately have access to all of your preferred searches, visualizations, dashboards, etc.
I realize that those items may need to be exported as JSON beforehand; however, we still haven't found an optimal/recommended way to load them back into Elasticsearch on startup.
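For the export step, one option (a sketch, not an official mechanism) is to pull the saved objects out through Kibana's saved objects export API and the index template API, and commit the resulting files to the repository. This assumes a Kibana 7.x-era stack running on the default local ports; the template name `my-template` and the output paths are placeholders.

```shell
#!/bin/sh
# Export Kibana saved objects (searches, visualizations, dashboards,
# index patterns) as NDJSON so they can be checked into the repository.
# Assumes Kibana >= 7.x on localhost:5601.
curl -s -X POST "http://localhost:5601/api/saved_objects/_export" \
  -H "kbn-xsrf: true" \
  -H "Content-Type: application/json" \
  -d '{"type": ["index-pattern", "search", "visualization", "dashboard"]}' \
  > kibana/saved_objects.ndjson

# Export an index template from Elasticsearch on localhost:9200.
# "my-template" is a placeholder; note the GET response wraps the
# template in an {"index_templates": [...]} envelope, so you may need
# to trim it down to just the template body before re-importing.
curl -s "http://localhost:9200/_index_template/my-template" \
  > elasticsearch/my-template.json
```

Both commands require a running stack, so this is something you would run once against your configured instance, not at build time.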
If you do not have the data available on disk, you will probably need to orchestrate loading it once the instances are up: either by loading the JSON through the HTTP APIs, or by restoring an existing snapshot that contains the data you need. As far as I know, there is no automatic, built-in way to load data into a clean cluster on startup.
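One way to orchestrate that from docker-compose itself is a one-shot "seed" container that waits for both services and then replays the exported files through the APIs. The sketch below is an assumption-laden example, not an official pattern: it assumes a 7.x stack, that `./elasticsearch/my-template.json` contains a bare index-template body and `./kibana/saved_objects.ndjson` a Kibana saved-objects export, and all service and file names are placeholders.

```yaml
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

  # One-shot seed container: polls until both services answer,
  # loads the exported objects over HTTP, then exits.
  seed:
    image: curlimages/curl:8.5.0
    depends_on:
      - elasticsearch
      - kibana
    volumes:
      - ./elasticsearch:/seed/elasticsearch:ro
      - ./kibana:/seed/kibana:ro
    entrypoint: /bin/sh
    command:
      - -c
      - |
        until curl -s http://elasticsearch:9200 >/dev/null; do sleep 2; done
        until curl -s http://kibana:5601/api/status >/dev/null; do sleep 2; done
        # Index template: the file must contain only the template body,
        # not the {"index_templates": [...]} envelope a GET returns.
        curl -s -X PUT http://elasticsearch:9200/_index_template/my-template \
          -H 'Content-Type: application/json' \
          -d @/seed/elasticsearch/my-template.json
        # Saved objects: import the NDJSON export; overwrite makes the
        # operation idempotent across restarts.
        curl -s -X POST 'http://kibana:5601/api/saved_objects/_import?overwrite=true' \
          -H 'kbn-xsrf: true' \
          --form file=@/seed/kibana/saved_objects.ndjson
```

Because the seed container exits after loading, re-running docker-compose up simply re-imports the same objects, which keeps a freshly checked-out environment and a long-lived one in the same state.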