Automated import of Kibana dashboards using API

Hi All,

We have two clusters in OpenShift which use the default EFK stack. We are looking to import the dashboards from Kibana in cluster1 into Kibana in cluster2. I know it's simple to do manually, but is there a way to send the exported JSON to Kibana in the second cluster?

I found the documentation to be very generic and I'm a bit lost at the moment.

Please suggest.

Regards, Sridatta

Hey @Sridattam

Yes, you can use the export/import APIs for that:
https://www.elastic.co/guide/en/kibana/current/dashboard-import-api-export.html
https://www.elastic.co/guide/en/kibana/current/dashboard-import-api-import.html

It's important to note that you should use the same method to both export and import a dashboard: if I remember correctly, there can be issues when importing via the UI a dashboard that was exported via the API, or vice versa. So import via the API only dashboards exported via the API, and import via the UI only dashboards exported via the UI.

That said, here is how to use the API:

  • To export a dashboard you first need its ID. You can see that ID in the browser URL when opening a dashboard (something like: http://localhost:5601/app/kibana#/dashboard/edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b),
    or via the Saved Objects page in Kibana's Management app: look for your dashboard there and the URL will contain the ID: http://localhost:5601/app/kibana#/management/kibana/objects/savedDashboards/edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b. You can also search for the ID directly in Elasticsearch on the .kibana index:
GET .kibana/_search
{
  "size":100,
  "query": {
    "match": {
      "type": "dashboard"
    }
  }
}
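
For example, to list the matching IDs from the command line (a sketch assuming Elasticsearch is reachable on localhost:9200 and jq is installed; adjust host and authentication to your setup — and note that, depending on the Kibana version, the _id may be prefixed with the object type, e.g. dashboard:edf84fe0-…):

# search the .kibana index for dashboard objects and print their IDs
curl -s -H 'Content-Type: application/json' \
  'http://localhost:9200/.kibana/_search' \
  -d '{"size":100,"query":{"match":{"type":"dashboard"}}}' | jq -r '.hits.hits[]._id'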
  • Then you can use the following API to export it (execute this outside the Kibana dev tools, e.g. from cURL):
GET api/kibana/dashboards/export?dashboard=edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b

It will result in a JSON object that describes the dashboard.
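
For example, with cURL (a minimal sketch assuming Kibana listens on localhost:5601 with no authentication; adjust host, port, and credentials to your setup):

# export the dashboard by ID and save the JSON payload to a file
curl -s 'http://localhost:5601/api/kibana/dashboards/export?dashboard=edf84fe0-e1a0-11e7-b6d5-4dc382ef7f5b' > dashboard.json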

  • Then import the resulting payload into the other Kibana with:
POST api/kibana/dashboards/import
{the body of the response from the export function}
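
For example, continuing the sketch above with the dashboard.json file from the export (the kbn-xsrf header is required by Kibana for non-GET API calls):

# import the previously exported payload into the target Kibana
curl -s -XPOST 'http://localhost:5601/api/kibana/dashboards/import' \
  -H 'Content-Type: application/json' -H 'kbn-xsrf: true' \
  --data-binary '@dashboard.json'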

There can be cases where you don't want to import all the exported objects. For example, you may not want to import the index-pattern because you already created one with the same name on the other cluster; in that case you can specify it on the request URL like ?exclude=index-pattern.
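
For example, the same import sketch as above with the exclude parameter added:

# import everything except the index-pattern objects
curl -s -XPOST 'http://localhost:5601/api/kibana/dashboards/import?exclude=index-pattern' \
  -H 'Content-Type: application/json' -H 'kbn-xsrf: true' \
  --data-binary '@dashboard.json'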

I hope that is a bit more useful now. Let me know if that helps.

Thanks @markov00. We are using Kibana and Elasticsearch v5.6.13, and this approach did not work as there is no such API support in Kibana at that version. So we had to go with the approach below, using the API exposed by the Elasticsearch service.

  1. Get the JSON data of the existing objects using curl as below.

Custom index patterns:
curl -s -k --cert /home/adminuser/admin-cert --key /home/adminuser/admin-key -H 'Content-Type: application/json' -XGET 'https://logging-es.openshift-logging.svc:9200/.kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1/index-pattern/_search'

Visualizations:
curl -s -k --cert /home/adminuser/admin-cert --key /home/adminuser/admin-key -H 'Content-Type: application/json' -XGET 'https://logging-es.openshift-logging.svc:9200/.kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1/visualization/_search'

Search Patterns:
curl -s -k --cert /home/adminuser/admin-cert --key /home/adminuser/admin-key -H 'Content-Type: application/json' -XGET 'https://logging-es.openshift-logging.svc:9200/.kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1/search/_search'

Dashboards:
curl -s -k --cert /home/adminuser/admin-cert --key /home/adminuser/admin-key -H 'Content-Type: application/json' -XGET 'https://logging-es.openshift-logging.svc:9200/.kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1/dashboard/_search'

  2. Then format the output something like the samples below and store everything in a single file, e.g. template.json. I have given a sample of each object type; the real output will be much larger. (A jq sketch for producing this format follows the samples.)

{ "index" : {"_index":".kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1","_type":"index-pattern","_id":"AWurxVv2_AqGHLOc5UaG" } }
{"title":".audit*","timeFieldName":"metadata.creationTimestamp","notExpandable":true,"fields":"[{******}] }

{ "index" : {"_index":".kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1","_type":"visualization","_id":"AWt0bhz8yGHtJqIwSwON" } }
{"title":"Application Logs Count Histogram By Severity","visState":"{"title":*******}}

{ "index" : {"_index":".kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1","_type":"search","_id":"AWtKT4IMoDMa3xzxm3C-" } }
{"title":"Product Logs Detailed View","description":"","hits":0,"columns":["Project_Name","Application_Component","level","message"]*****}

{ "index" : {"_index":".kibana.4322d83fbeb45aec24ebf9df53641cf2decff6c1","_type":"dashboard","_id":"AWtvpcn7NVl7Ab8DqjqQ" } }
{"title":"Product Logs","hits":0,"description":******}

  3. Then use this template as the standard default in the other EFK cluster and import it using curl as below. We use Elasticsearch's _bulk API to import the data, and since we are passing JSON containing line breaks we have to use --data-binary with curl.

curl -s -k --cert /home/adminuser/admin-cert --key /home/adminuser/admin-key -H 'Content-Type: application/json' -XPOST 'https://logging-es.openshift-logging.svc:9200/_bulk' --data-binary "@template.json"
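
The _bulk response reports the result of every item; a quick sanity check (again assuming jq) is to look at the top-level errors flag:

# prints "false" when every item was indexed successfully
curl -s -k --cert /home/adminuser/admin-cert --key /home/adminuser/admin-key \
  -H 'Content-Type: application/json' \
  -XPOST 'https://logging-es.openshift-logging.svc:9200/_bulk' --data-binary "@template.json" | jq '.errors'

If it prints true, inspect .items[] in the response for the failing entries.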

Just wanted to share this, as it may be helpful for others running older versions. Our objective was to have a default template for automated installation of the EFK stack, so that default dashboards are visible to users simply by importing this template on any new setup.
