Export and import Kibana dashboards and visualizations via curl

hi there,

I'm trying to export all the dashboards to JSON without using the Kibana UI itself (using curl).

At the moment I can get the dashboards, but to open them on another instance I need the visualizations they reference as well.
My problem is getting the visualizations out of Kibana and exporting them to JSON via curl.
How can I do that? Any ideas?

One more thing: using Export Everything in Kibana didn't give me a JSON file or anything at all.



Hi @LiorK,

I am also trying to import a dashboard into Kibana. Can you tell me how you succeeded? I am trying:
curl -XPUT http://localhost:9200/.kibana/dashboard/testdashboard -d '{[
"_id": "D1",
"_type": "dashboard",
"_source": {
"title": "D1",
"hits": 0,
"description": "",
"panelsJSON": "[{"id":"V1","type":"visualization","panelIndex":1,"size_x":3,"size_y":2,"col":1,"row":1}]",
"optionsJSON": "{"darkTheme":false}",
"uiStateJSON": "{}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{"filter":[{"query":{"query_string":{"query":"*","analyze_wildcard":true}}}]}"
and I'm getting this error:
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse"}],"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"json_parse_exception","reason":"Unexpected character ('[' (code 91)): was expecting either valid name character (for unquoted name) or double-quote (for quoted) to start field name\n at [Source: org.elasticsearch.common.io.stream.InputStreamStreamInput@77224b66; line: 1, column: 3]"}},"status":400}

Kindly suggest some points.
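The parse error points exactly at column 3: the body opens with `'{[`, which is not valid JSON, and the quotes inside panelsJSON/optionsJSON/searchSourceJSON are not escaped. A sketch of building a valid body with Python's json module, which handles the escaping of those nested JSON strings (IDs and host are the placeholders from the post, not a tested recipe):

```python
import json

# Build the dashboard source as plain Python structures, then let
# json.dumps handle the quoting/escaping of the nested JSON strings.
panels = [{"id": "V1", "type": "visualization", "panelIndex": 1,
           "size_x": 3, "size_y": 2, "col": 1, "row": 1}]
search_source = {"filter": [{"query": {"query_string": {
    "query": "*", "analyze_wildcard": True}}}]}

dashboard = {
    "title": "D1",
    "hits": 0,
    "description": "",
    # Kibana stores these fields as JSON *strings*, not nested objects,
    # so each one is serialized separately before going into the body.
    "panelsJSON": json.dumps(panels),
    "optionsJSON": json.dumps({"darkTheme": False}),
    "uiStateJSON": "{}",
    "version": 1,
    "timeRestore": False,
    "kibanaSavedObjectMeta": {"searchSourceJSON": json.dumps(search_source)},
}

body = json.dumps(dashboard)
# The body (without _id/_type/_source wrappers, since the ID is in the URL)
# is what goes into the PUT:
#   curl -XPUT http://localhost:9200/.kibana/dashboard/testdashboard -d "$body"
print(body)
```

Note the `_id`/`_type`/`_source` wrapper is also dropped: when you PUT to `/.kibana/dashboard/testdashboard`, the body should be the document source itself.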

I use a Python script that pulls the dashboards I want from ES, enumerates the panels in the JSON of the dashboard, grabs those visualisations from ES and then looks for the index-patterns each of the visualisations uses and grabs those. Once I have all these documents I index them to the target index/ES host.
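A sketch of that traversal, assuming the plain Elasticsearch HTTP API on the `.kibana` index (the helper names and host are mine, not the poster's script). The core step is parsing a dashboard's panelsJSON to enumerate the visualisations it references:

```python
import json
from urllib.request import urlopen


def referenced_visualisations(dashboard_source):
    """Return the visualisation IDs listed in a dashboard's panelsJSON.

    panelsJSON is stored as a JSON *string* inside the dashboard document,
    so it has to be parsed a second time.
    """
    panels = json.loads(dashboard_source["panelsJSON"])
    return [p["id"] for p in panels if p.get("type") == "visualization"]


def fetch_source(host, doc_type, doc_id):
    """GET one saved object's _source from the .kibana index (network call)."""
    with urlopen(f"{host}/.kibana/{doc_type}/{doc_id}/_source") as resp:
        return json.load(resp)


# Sketch of the full traversal (not run here; host is a placeholder):
# dash = fetch_source("http://localhost:9200", "dashboard", "D1")
# for vis_id in referenced_visualisations(dash):
#     vis = fetch_source("http://localhost:9200", "visualization", vis_id)
#     # ...then inspect vis["kibanaSavedObjectMeta"]["searchSourceJSON"] to
#     # find the index-pattern (and any saved search) it uses, fetch those
#     # too, and finally index every collected document into the target
#     # .kibana index.
```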

Right now I'm using an index template to avoid dynamic mapping issues (which I have seen), in the future I'll be pulling the mapping from the source index which will be more robust in supporting new Kibana features and plugins.

Currently I have a script exporting all the dashboards, visualisations and searches, and importing them into another .kibana index.


  • Note: getting the dashboards alone will not help you if they contain visualisations, and visualisations that use saved searches. You will need those objects as well.
  • Does your dashboard D1 have visualisations inside? Can you try to reach the following URL: http://localhost:9200/.kibana/dashboard/D1/_source

@pemontto, can you please explain how dynamic mapping is related to this?

Assuming you understand Elasticsearch dynamic mapping, an example of where this goes awry is Kibana's time fields, specifically "timeFrom" and "timeTo" in the dashboard document type. These fields can hold either a full ISO 8601 date or a relative expression like "now-3y".

If Elasticsearch first sees a document with an ISO 8601 date, the field will be dynamically mapped as a date field; when you then attempt to index a document containing "now-3y", you'll get a data type mismatch and all those dashboards will fail to index.

So the solution is either to copy the mapping across from the source index, or to use an index template. Again, I'd suggest copying the mapping from the source for future-proofing.
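A minimal sketch of the index-template fix, mapping the two problem fields as keyword so both ISO dates and relative expressions index cleanly. The field names come from the post above; the template name and everything else in the body are my assumptions (and on Elasticsearch 2.x you would use a not_analyzed "string" instead of "keyword"):

```python
import json

# Index template applied to .kibana so timeFrom/timeTo are never
# dynamically mapped as dates. "keyword" accepts both "2017-01-01"
# and relative expressions like "now-3y" without a type mismatch.
template = {
    "template": ".kibana",
    "mappings": {
        "dashboard": {
            "properties": {
                "timeFrom": {"type": "keyword"},
                "timeTo": {"type": "keyword"},
            }
        }
    },
}

body = json.dumps(template)
# Register it before importing any dashboards (host is a placeholder):
#   curl -XPUT http://localhost:9200/_template/kibana-export -d "$body"
print(body)
```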

I tried https://github.com/elastic/beats-dashboards, but the problem with it is that it does not support source metadata ("_id"), and thus it is not usable for me, as I was relying on that metadata for the uniqueness of visualizations and their mapping into dashboards. Can you suggest some points on that?

I would post this question in the Beats channel: https://discuss.elastic.co/c/beats

How do I export and import the pre-configured dashboards in Kibana (5.4.0)?
Can anyone please help me?