If it is possible to save the dashboard to a JSON file, then I also wonder how to load it back.
Is it enough to fetch them with curl and then load them back using curl -XPUT?
There is a tool built into Kibana for this. If you go to the Management tab, Saved Objects, you can export and then later import whatever you want (including everything).
If you export a Dashboard, you would also have to export the Visualizations and/or Saved Searches that are used in that Dashboard, and then re-import those.
You could export everything and edit that JSON to contain only the things you want to import later.
Export does NOT include index patterns, so you would have to re-create any index patterns used by the Saved Searches and Visualizations, with the same names as before.
On the clean installation, you should create the index pattern first, then import the saved objects.
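If you want to script that step instead of using the UI, the index pattern is itself just a document in the .kibana index, so something along these lines should work (a rough sketch, assuming Kibana 4.x-style .kibana mappings, Elasticsearch on localhost:9200, and a packetbeat-* pattern; swap in your own names):
# Create the index pattern document that the imported saved objects will refer to.
curl -XPUT 'http://localhost:9200/.kibana/index-pattern/packetbeat-*' -d '{
  "title": "packetbeat-*",
  "timeFieldName": "@timestamp"
}'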
Do you know how to import these directly to elastic search using curl?
I have created templates for my index patterns, which I load when installing Logstash.
I would like to do something similar for dashboards and visualizations.
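For the index templates, the command I run at install time looks roughly like this (the template name and file are just placeholders):
# Load an index template into Elasticsearch at install time.
curl -XPUT 'http://localhost:9200/_template/packetbeat' -d @packetbeat.template.json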
I'm able to get the _source of a dashboard like this:
GET .kibana/dashboard/_search
{
  "query": {
    "match": { "_id": "Packetbeat-Cassandra" }
  }
}
But the source is embedded in the whole response and you'd have to parse that out to be able to PUT it back in.
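One way to avoid the parsing, assuming a Kibana 4.x-style .kibana index (dashboards stored under the dashboard type, as in the query above) and Elasticsearch on localhost:9200, is the _source endpoint, which returns only the document body so it can be PUT straight back:
# Save just the dashboard body, without the search response wrapper.
curl -s 'http://localhost:9200/.kibana/dashboard/Packetbeat-Cassandra/_source' -o Packetbeat-Cassandra.json

# On the target cluster, index it again under the same type and id.
curl -XPUT 'http://localhost:9200/.kibana/dashboard/Packetbeat-Cassandra' -d @Packetbeat-Cassandra.json
You would repeat that for each visualization and saved search the dashboard uses.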
An alternative is a tool called elasticdump (https://github.com/taskrabbit/elasticsearch-dump). This tool was not written by Elastic and is not supported, but I've used it and it works pretty well.
It makes it pretty easy to save the whole .kibana index and load it back in, or just one dashboard at a time.
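For a single dashboard, something like this should work (a sketch from memory, assuming elasticdump is installed from npm and the cluster is on localhost:9200; double-check the flags against the project's README):
# Dump only the Packetbeat-Cassandra dashboard document from the .kibana index.
elasticdump \
  --input=http://localhost:9200/.kibana \
  --output=Packetbeat-Cassandra.json \
  --type=data \
  --searchBody='{"query": {"match": {"_id": "Packetbeat-Cassandra"}}}'

# Load the dump file back into the .kibana index on another cluster.
elasticdump \
  --input=Packetbeat-Cassandra.json \
  --output=http://localhost:9200/.kibana \
  --type=data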