Exporting Dashboards with script

Hi Team,
I am trying to export dashboards, searches, and visualizations and push them into Kibana using a script.
Now that mapping types have been removed from indices, how can I push the saved objects?
I am using the curl command below:

curl -s -XPUT 'http://localhost:9200/.kibana/doc/System_Navigation' -H "Content-Type: application/json" -d@/visualization/System_Navigation.json

I am saving all the dashboards separately as JSON files using the export API in the Management tab of Kibana. The format they are saved in is not the format that the above command accepts; it gives errors for the metadata fields _id and _type:

"reason":"Field [_id] is a metadata field and cannot be added inside a document. Use the index API request parameters."}

[
  {
    "_id": "0441cfb0-faa1-11e7-92ec-876f140dfe8f",
    "_type": "search",
    "_source": {
      "title": "Metricbeat Search",
      "description": "",
      "hits": 0,
      "columns": [
        "_source"
      ],
      "sort": [
        "@timestamp",
        "desc"
      ],
      "version": 1,
      "kibanaSavedObjectMeta": {
        "searchSourceJSON": "{\"index\":\"9d139a80-e696-11e7-8461-6326e2e461fd\",\"highlightAll\":true,\"version\":true,\"query\":{\"language\":\"lucene\",\"query\":\"\"},\"filter\":[]}"
      }
    }
  }
]

My question is: how can I push these exported objects in? Do I need to manually reformat all the exported JSONs (dashboards, visualizations, searches, and index patterns) to match the mapping of the .kibana index, like below:

{
  "type": "search",
  "updated_at": "2018-01-11T09:51:49.685Z",
  "search": {
    "title": "VMs search",
    "description": "",
    "hits": 0,
    "columns": [
      "_source"
    ],
    "sort": [
      "@timestamp",
      "desc"
    ],
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": """{"index":"532c80a0-f5e9-11e7-8143-dbeab0fdfcdf","highlightAll":true,"version":true,"query":{"query":"","language":"lucene"},"filter":[{"meta":{"index":"532c80a0-f5e9-11e7-8143-dbeab0fdfcdf","negate":false,"disabled":false,"alias":null,"type":"exists","key":"resource_metadata.state","value":"exists"},"exists":{"field":"resource_metadata.state"},"$state":{"store":"appState"}}]}"""
    }
  }
}

Please help!!

Hi @Shubham_Mahajan,

You are correct.

The format used for import/export through the Saved Objects UI is different from the format used when adding documents to an ES index.

The Saved Objects import process consumes raw ES documents, like:

{
    "_id": "PgSQL-transactions",
    "_type": "search",
    "_source": {
      "columns": [
        "method",
        "type",
        "path",
        "responsetime",
        "status"
      ],
      "description": "",
      "hits": 0,
      "kibanaSavedObjectMeta": {
        "searchSourceJSON": "{\"index\":\"packetbeat-*\",\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}}},\"filter\":[{\"meta\":{\"index\":\"packetbeat-*\",\"negate\":false,\"key\":\"type\",\"value\":\"pgsql\",\"disabled\":false},\"query\":{\"match\":{\"type\":{\"query\":\"pgsql\",\"type\":\"phrase\"}}}}],\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}}}"
      },
      "sort": [
        "@timestamp",
        "desc"
      ],
      "title": "PgSQL transactions",
      "version": 1
    },
    "_meta": {
      "savedObjectVersion": 2
    }
  }

However, adding a document to an ES index just requires the source, which looks like:

{
  "type": "search",
  "search": {
    "columns": [
      "method",
      "type",
      "path",
      "responsetime",
      "status"
    ],
    "description": "",
    "hits": 0,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": "{\"index\":\"packetbeat-*\",\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}}},\"filter\":[{\"meta\":{\"index\":\"packetbeat-*\",\"negate\":false,\"key\":\"type\",\"value\":\"pgsql\",\"disabled\":false},\"query\":{\"match\":{\"type\":{\"query\":\"pgsql\",\"type\":\"phrase\"}}}}],\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}}}"
    },
    "sort": [
      "@timestamp",
      "desc"
    ],
    "title": "PgSQL transactions",
    "version": 1
  }
}

For the latter, you specify the id as part of the URL, like: http://localhost:9200/.kibana/doc/PgSQL-transactions
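
Putting those two pieces together, a minimal sketch (assuming Elasticsearch is running on localhost:9200 and you saved the source-only document above to a local file, here called PgSQL-transactions.json as a placeholder) would be:

curl -s -XPUT 'http://localhost:9200/.kibana/doc/PgSQL-transactions' \
     -H 'Content-Type: application/json' \
     -d @PgSQL-transactions.json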

Let me know if that helps!

@chrisronline So basically I'm going to have to manually modify each JSON object into the required format and post it, right?
Also, the dashboards and visualizations use specific IDs to reference searches and other objects, so I'll have to use those same IDs when posting these objects, right?

Yes, that sounds right @Shubham_Mahajan.

If you're exporting Saved Objects using the UI, you could just import them through the UI too, and you wouldn't have to modify the JSON.

However, if you're manually exporting and then programmatically importing, you will need to modify the source.
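
If you'd rather script that conversion than edit each file by hand, here is a rough sketch of one way to do it with jq and curl. Everything in it is an assumption rather than something from this thread: it assumes jq is installed, Elasticsearch is reachable at localhost:9200, and the UI export was saved as export.json (a placeholder name).

#!/usr/bin/env bash
# Sketch: reshape a Kibana UI export (an array of {_id, _type, _source} entries)
# into plain documents and index each one into the .kibana index.
# Assumptions: jq is installed, Elasticsearch is at localhost:9200, and the
# export file is export.json -- adjust these placeholders to your environment.
set -euo pipefail

EXPORT_FILE=export.json
ES_URL=http://localhost:9200

jq -c '.[]' "$EXPORT_FILE" | while read -r obj; do
  id=$(jq -r '._id' <<<"$obj")
  # Drop the _id/_type metadata fields and nest _source under a field named
  # after the saved object type, matching the second example above.
  body=$(jq -c '{type: ._type, (._type): ._source}' <<<"$obj")
  # Note: depending on the Kibana 6.x version, the document _id may need the
  # type as a prefix (e.g. "search:<id>") rather than the bare id.
  curl -s -XPUT "$ES_URL/.kibana/doc/$id" \
       -H 'Content-Type: application/json' \
       -d "$body"
  echo    # newline between Elasticsearch responses
done

The key step is the jq filter, which does the reshaping described above; each object is indexed under its original ID, so the references between dashboards, visualizations, and searches stay intact.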

Oh man it’s gonna be a lot of work😅.. Thanks a lot for the info, Chris!!

Good luck!
