We are developing our project using docker / docker-compose. Elasticsearch and Kibana are part of that setup. We'd like to ensure that each time we start our project with docker-compose, our saved index templates, saved searches, visualizations and dashboards are there for us.
Can you please provide an example of how this can be achieved?
All Kibana data is stored in Elasticsearch, so you must make sure that your Elasticsearch docker instance stores all data on persistent storage.
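For completeness, a minimal docker-compose sketch of that idea (the image tag and volume name are assumptions; adjust to your setup): mounting a named volume over Elasticsearch's data path keeps index data, including the .kibana index that holds dashboards and visualizations, across container restarts:

```yaml
# docker-compose.yml (sketch; image tag and volume name are assumptions)
version: "2.1"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.5.4
    environment:
      - discovery.type=single-node
    volumes:
      # Named volume: index data (including the .kibana index with
      # saved searches, visualizations and dashboards) survives
      # container removal and re-creation.
      - esdata:/usr/share/elasticsearch/data
volumes:
  esdata:
```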
Unfortunately that's hardly helpful. We're talking about a development environment, where persistent storage is the least of our worries.
We are trying to find a solution to a scenario where (for example):
- you do a clean checkout of code from the repository to your workstation
- you have kibana inside your docker-compose setup
- you run a docker-compose up
- you immediately have access to all of your preferred searches/visualizations/dashboards etc.
I realize that those items may need to be exported as JSON beforehand; however, we still haven't found an optimal/recommended way to load them back into ES on startup.
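As a sketch of the export step (assuming a Kibana version that ships the saved objects export API, 6.5+; the URL, output file name and type list are assumptions): curl the export endpoint and commit the resulting NDJSON file to source control:

```shell
#!/usr/bin/env sh
# Sketch: export Kibana saved objects into a file that can be versioned.
# KIBANA_URL and OUT are assumptions; adjust to your docker-compose setup.
KIBANA_URL="${KIBANA_URL:-http://localhost:5601}"
OUT="${OUT:-kibana-saved-objects.ndjson}"
EXPORT_TYPES='{"type": ["index-pattern", "search", "visualization", "dashboard"]}'

export_objects() {
  # kbn-xsrf header is required by the Kibana API.
  curl -sf -X POST "$KIBANA_URL/api/saved_objects/_export" \
    -H 'kbn-xsrf: true' \
    -H 'Content-Type: application/json' \
    -d "$EXPORT_TYPES" \
    -o "$OUT"
}

# Only talk to a live Kibana when explicitly requested,
# e.g. RUN_EXPORT=1 ./export.sh
if [ "${RUN_EXPORT:-0}" = "1" ]; then
  export_objects
fi
```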
Can you show us a recommended way to do this?
If you do not have the data available on disk, you probably need to orchestrate loading it once the instances are up, either by loading JSON or by restoring an existing snapshot that contains the data you need. There is no automatic built-in way to load data into a clean cluster on startup, as far as I know.
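For the JSON route, a hedged sketch of such an orchestration step (the URL, file path and RUN_SEED guard are assumptions; assumes a Kibana version with the saved objects import API, 6.5+): poll Kibana until it answers, then POST the previously exported objects back:

```shell
#!/usr/bin/env sh
# Sketch of a startup loader for saved objects. KIBANA_URL, OBJECTS_FILE
# and the RUN_SEED guard are assumptions; adjust to your compose setup.
KIBANA_URL="${KIBANA_URL:-http://kibana:5601}"
OBJECTS_FILE="${OBJECTS_FILE:-/seed/kibana-saved-objects.ndjson}"

# Poll a URL with curl until it succeeds, up to $2 attempts, 2s apart.
wait_for() {
  url="$1"; tries="${2:-60}"; i=0
  while [ "$i" -lt "$tries" ]; do
    curl -sf "$url" >/dev/null 2>&1 && return 0
    i=$((i + 1)); sleep 2
  done
  return 1
}

# Import the NDJSON export, overwriting objects with the same ids.
import_objects() {
  curl -sf -X POST "$KIBANA_URL/api/saved_objects/_import?overwrite=true" \
    -H 'kbn-xsrf: true' \
    -F "file=@$OBJECTS_FILE"
}

# Only run against a live Kibana when explicitly asked for,
# e.g. RUN_SEED=1 ./seed.sh
if [ "${RUN_SEED:-0}" = "1" ]; then
  wait_for "$KIBANA_URL/api/status" && import_objects
fi
```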
Let's assume it's under source control. What would be the best thing to do? Use a 3rd-party container with a script that loads it?
Yes, using a custom container for initialisation might be a good option.
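As a sketch of that approach (service name, image, paths and version are all assumptions): add a short-lived seed service to the compose file that depends on kibana, waits until it responds, imports a versioned saved-objects file via the saved objects API, and exits:

```yaml
# docker-compose.yml fragment (sketch; names and versions are assumptions)
  kibana-seed:
    image: curlimages/curl:latest
    depends_on:
      - kibana
    volumes:
      # ./seed holds the exported kibana-saved-objects.ndjson under
      # source control.
      - ./seed:/seed:ro
    # Wait for Kibana to come up, import the objects, then exit.
    entrypoint: >
      sh -c "until curl -sf http://kibana:5601/api/status >/dev/null; do sleep 2; done &&
             curl -sf -X POST 'http://kibana:5601/api/saved_objects/_import?overwrite=true'
             -H 'kbn-xsrf: true' -F 'file=@/seed/kibana-saved-objects.ndjson'"
```

Because the seed service exits after importing, a plain docker-compose up leaves only elasticsearch and kibana running, with the dashboards already in place.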