Automatic dashboard export from the command line - using Python

Many people have asked this question in the past. I searched around for the same thing for many days and didn't find a right answer or a proper explanation.
I am not good at Python, but I used it and implemented my solution.
I am posting it here in case someone else finds it useful.


from datetime import datetime

import ndjson
import requests

##### Some variables which will be used throughout the script
timestamp = datetime.now()
##### convert this timestamp to a string, will use it for the filename
day_month_year = timestamp.strftime("%d-%b-%Y")
backup_dir = '/root/dashboard_backups/'

##### credentials used for every request (fill in your own)
elastic_user = 'elastic_user'
elastic_password = 'elastic_password'

##### request body pieces used for exporting a dashboard + associated objects
data1 = '\n{ "objects":[\n   {\n "type":"dashboard",\n "id":'
data2 = '\n } ] ,\n "includeReferencesDeep": true\n}'

##### headers and params for requests.get and requests.post
headers = {
    'kbn-xsrf': 'true',
    'Content-Type': 'application/json',
}
params = (
    ('type', 'dashboard'),
)

for master in ["elkm01", "elktstm01", "elkdevm01"]:
    url = f'http://{master}:5601/'
    url_space = f'http://{master}:5601/api/spaces/space'

    ##### retrieve all the space names from one master
    space_names = requests.get(url_space, auth=(elastic_user, elastic_password))
    ##### output is a JSON array because it is between [ ] brackets
    space_names_array = space_names.json()

    ##### found multiple spaces, now loop over them
    for space in space_names_array:
        space_id = space['id']
        space_title = space['name']

        if space_id == 'default':
            ##### the default space uses the plain URLs, without the /s/<space_id> prefix
            dashboard_url = url + "api/saved_objects/_find"
            dashboard_export_url = url + "api/saved_objects/_export"
        else:
            ##### any other space needs the /s/<space_id> prefix,
            ##### one URL for retrieving dashboard names and ids,
            ##### a second for exporting a dashboard once you know the dashboard_id
            dashboard_url = url + "s/" + space_id + "/api/saved_objects/_find"
            dashboard_export_url = url + "s/" + space_id + "/api/saved_objects/_export"

        dashboard_names = requests.get(dashboard_url, params=params, headers=headers,
                                       auth=(elastic_user, elastic_password))
        ##### the response has three key/value pairs; the last one is saved_objects,
        ##### whose value is an array
        dashboard_names_array = dashboard_names.json()['saved_objects']

        ##### found multiple dashboards for that one space, now loop and export all of them
        for dashboard in dashboard_names_array:
            dashboard_id = dashboard['id']
            ##### replace spaces with _ in the dashboard title and build the output filename
            dashboard_title = (master + "_" + space_id + "_"
                               + dashboard['attributes']['title'].replace(" ", "_")
                               + ".ndjson" + "_" + day_month_year)
            dashboard_title = backup_dir + dashboard_title

            ##### now export that one individual dashboard
            data = data1 + '"' + dashboard_id + '"' + data2
            ##### read it from Kibana
            export_dashboard = requests.post(dashboard_export_url, headers=headers,
                                             data=data, auth=(elastic_user, elastic_password))

            ##### use the ndjson decoder and write one JSON object per line,
            ##### the way Kibana likes it; then you will be able to import it back
            items = export_dashboard.json(cls=ndjson.Decoder)
            output = ndjson.dumps(items)
            print("exporting -", dashboard_title)
            with open(dashboard_title, mode="w") as f:
                f.write(output)
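Once the files are written, you can sanity-check a backup before trusting it: each line must parse as a standalone JSON document, which is the shape Kibana's importer expects. A minimal check using only the standard library (the sample string below is just an illustration of the format, not real export output):

```python
import json

def validate_ndjson(text):
    """Parse an ndjson string; return the decoded objects, or raise on a bad line."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

# a tiny two-line ndjson sample, the same one-object-per-line shape the export produces
sample = '{"type": "dashboard", "id": "abc"}\n{"type": "visualization", "id": "def"}\n'
objects = validate_ndjson(sample)
print(len(objects))           # 2
print(objects[0]["type"])     # dashboard
```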

If you have any suggestions on this, I will gladly take them.

You can do it with a simple bash script as well.

#!/bin/bash

cd /opt/index_backup

for i in $(curl -s -k -u elastic_user:elastic_password \
        -H "Content-Type: application/json" \
        -XGET 'http://10.80.3.11:9200/_cat/indices/.kibana*' | awk '{print $3}'); do
    curl -s -k -o $i.json -u elastic_user:elastic_password \
        -H "Content-Type: application/json" \
        -XGET 'http://10.80.3.11:9200/'$i'/_search?&pretty=true'
done

Paula, thanks for the reply.

Actually, no. With my script you can import the export back.
Also, with your approach you can't export in the ndjson format, which the new 7.x versions need.
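For completeness, here is a sketch of how an exported ndjson file could be pushed back through Kibana's saved objects import API. The host, space id, and file path below are placeholders; the `/api/saved_objects/_import` endpoint takes a multipart file upload rather than a JSON body, so the headers differ from the export calls above — verify the details against your 7.x Kibana docs:

```python
def kibana_import_url(base_url, space_id):
    """Build the _import URL; the default space has no /s/<space_id> prefix."""
    if space_id == "default":
        return base_url + "api/saved_objects/_import?overwrite=true"
    return base_url + "s/" + space_id + "/api/saved_objects/_import?overwrite=true"

def import_dashboard(base_url, space_id, ndjson_path, user, password):
    """Upload one exported .ndjson file back into Kibana."""
    import requests  # imported here so the URL helper works without requests installed
    with open(ndjson_path, "rb") as f:
        # the importer expects a multipart field named "file",
        # and kbn-xsrf must be set; do NOT send a JSON Content-Type here
        return requests.post(
            kibana_import_url(base_url, space_id),
            headers={"kbn-xsrf": "true"},
            files={"file": f},
            auth=(user, password),
        )

# example call (placeholder host and filename):
# import_dashboard("http://elkm01:5601/", "default",
#                  "/root/dashboard_backups/elkm01_default_My_Dashboard.ndjson_01-Jan-2020",
#                  "elastic_user", "elastic_password")
```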

I just tested your command, which reads the .kibana* index.

It does not give you dashboards, just all the saved objects stored in that index:

curl -s -k -u elastic_user:elastic_password -H "Content-Type: application/json" -XGET 'localhost:9200/.kibana_1/_search?&pretty=true' |grep '"type"'
"type" : "space",
"type" : "telemetry",
"type" : "index-pattern",
"type" : "index-pattern",
"type" : "visualization",
"type" : "visualization",
"type" : "visualization",
"type" : "visualization",
"type" : "visualization",
"type" : "visualization",

Whereas what I needed was to walk through all my spaces (10+) and export all the dashboards from all spaces, including all related objects.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.