We have two clusters in OpenShift that use the default EFK stack. We are looking to import the dashboards from Kibana in cluster1 into Kibana in cluster2. I know it's simple to do manually; is there a way to send the exported JSON to Kibana in the second cluster?
I found the documentation to be very generic and I'm a bit lost at the moment.
It's important to note that you should use the same method to export a dashboard and to import it: if I remember correctly, there can be issues when importing via the UI a dashboard exported via the API, or vice versa. So import via the API only dashboards exported via the API, and import via the UI only dashboards exported via the UI.
That said, you should use the API:
To export a dashboard you first need the dashboard ID. That ID can be seen in the browser URL when opening a dashboard (something like http://localhost:5601/app/kibana#/dashboard/edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b),
or, using the Saved Objects page in the Management app of Kibana, you can look for your dashboard there and see the ID in the URL: http://localhost:5601/app/kibana#/management/kibana/objects/savedDashboards/edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b. You can also search for that ID directly in Elasticsearch on the .kibana index.
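For example, a minimal search by dashboard title (here elasticsearch:9200 and "My dashboard" are placeholders for your own Elasticsearch endpoint and dashboard name):
curl -s 'http://elasticsearch:9200/.kibana/_search?pretty' \
  -H 'Content-Type: application/json' \
  -d '{"query":{"query_string":{"query":"My dashboard"}}}'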
Then you can use the following API to export it (execute this outside Kibana Dev Tools, e.g. from cURL):
GET api/kibana/dashboards/export?dashboard=edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b
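From cURL that would look something like this, assuming Kibana is reachable at localhost:5601 (substitute your own host, port and dashboard ID):
curl -s 'http://localhost:5601/api/kibana/dashboards/export?dashboard=edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b' > export.json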
It will return a JSON object that describes the dashboard.
Then import the resulting payload into the other Kibana with:
POST api/kibana/dashboards/import
{the body of the response from the export function}
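From cURL, again assuming Kibana at localhost:5601, and noting that the Kibana API requires the kbn-xsrf header on POST requests:
curl -s -XPOST 'http://localhost:5601/api/kibana/dashboards/import' \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  --data-binary @export.json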
There can be cases where you don't want to import all the exported objects; for example, you may not want to import the index-pattern because you already created one with the same name on the other cluster. In that case you can specify that in the request URL, like ?exclude=index-pattern, as in the example below.
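That is the same import call as above, just with the exclude parameter added:
curl -s -XPOST 'http://localhost:5601/api/kibana/dashboards/import?exclude=index-pattern' -H 'kbn-xsrf: true' -H 'Content-Type: application/json' --data-binary @export.json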
I hope that is a bit more useful now. Let me know if that helps.
Thanks @markov00. We are using Kibana and Elasticsearch v5.6.13, and this approach did not work as there is no dashboard API support in that version of Kibana. So we had to go with the approach below, using the API exposed by the Elasticsearch service.
First, get the JSON data of the existing saved objects using curl, as below.
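Something like this, querying the .kibana index for each saved-object type (elasticsearch:9200 is a placeholder for your Elasticsearch service URL; add the appropriate TLS/cert options if your endpoint is secured, as in a default OpenShift EFK install):
curl -s 'http://elasticsearch:9200/.kibana/index-pattern,search,visualization,dashboard/_search?size=1000&pretty' > objects.json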
Then format the output something like below and store it in a single file, e.g. template.json. I have given a sample of each object type, but the real output will be very big.
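A sketch of the bulk format (the IDs, titles and "..." placeholders here are illustrative only; in ES 5.x each action line names the index, type and ID, followed by the document source on the next line):
{"index":{"_index":".kibana","_type":"index-pattern","_id":"logstash-*"}}
{"title":"logstash-*","timeFieldName":"@timestamp","fields":"[...]"}
{"index":{"_index":".kibana","_type":"visualization","_id":"sample-visualization"}}
{"title":"Sample visualization","visState":"{...}","uiStateJSON":"{}","version":1,"kibanaSavedObjectMeta":{"searchSourceJSON":"{...}"}}
{"index":{"_index":".kibana","_type":"dashboard","_id":"sample-dashboard"}}
{"title":"Sample dashboard","panelsJSON":"[...]","optionsJSON":"{}","version":1,"timeRestore":false,"kibanaSavedObjectMeta":{"searchSourceJSON":"{...}"}}
Note that the file must end with a trailing newline, otherwise _bulk rejects the last line.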
Then use this template as the standard default in our other EFK cluster and import it using curl as below. We are using the _bulk API from Elasticsearch to import the data, and since we are passing JSON with newline breaks we have to use --data-binary with curl (a plain -d would strip the newlines that the bulk format requires).
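For example (again, elasticsearch:9200 is a placeholder for your Elasticsearch service URL):
curl -s -XPOST 'http://elasticsearch:9200/_bulk' \
  -H 'Content-Type: application/x-ndjson' \
  --data-binary @template.json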
Just wanted to share this as it may be helpful for others who are running older versions. Our objective was to have a default template for the automated installation of the EFK stack, so that after importing this template on any new setup the default dashboards are visible to users.