How to store a Kibana dashboard in a Docker container?

If Kibana stores the dashboard in the ES .kibana index, how can I save the dashboard so that users who spin up the ELK stack with docker-compose can access it? I know volumes are used for persistent storage, but I am talking about a scenario in which this docker-compose setup is run on different environments, so storing the dashboard on a volume is not an option.

[root@onw-kwah-2v logstash]# cat pipeline/bp-filter.conf
input {
  file {
    path => "${INPUT1}/syslog.log*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => {"message" => ["%{TIMESTAMP_ISO8601:logdate} %{HOSTNAME:hostname} %{WORD:conatiner_name}: %{GREEDYDATA:[@metadata][messageline]}",
    "%{TIMESTAMP_ISO8601:logdate} %{HOSTNAME:hostname} %{WORD:container}\[%{INT:haprorxy_id}\]: %{GREEDYDATA:[@metadata][messageline]}"]}
  }
  if "_grokparsefailure" in [tags] {
    drop {}
  }
  mutate {
    remove_field => ["message", "@timestamp"]
  }
  json {
    source => "[@metadata][messageline]"
  }
  if "_jsonparsefailure" in [tags] {
    drop {}
  }
  date {
    match => ["logdate", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ"]
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logs-%{+yyyy-MM-dd}"
    document_type => "applicationlogs"
  }
}

Hi there,

You can export your dashboard into a JSON file and create a script that sends a request to Kibana's "import dashboard" API endpoint with this file as a payload. Here's an example of how docker-compose can run a script like this: https://github.com/elastic/stack-docker/blob/master/docker-compose.yml#L116. Here is some information on the "import dashboard" API: https://github.com/elastic/kibana/issues/14872.
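
For illustration, a minimal import script could look like the sketch below. It is only a sketch under a few assumptions: Kibana is reachable as kibana:5601 inside the compose network, and the exported objects are in a file called my-dashboards.json (both names are placeholders, not taken from the stack-docker example).

#!/bin/sh
# Sketch of a one-shot import script for docker-compose.
# Assumptions: Kibana is reachable at http://kibana:5601 and the exported
# saved objects are in my-dashboards.json next to this script.
KIBANA_URL="${KIBANA_URL:-http://kibana:5601}"
DASHBOARD_FILE="${DASHBOARD_FILE:-./my-dashboards.json}"

# Wait until Kibana responds before trying to import.
until curl -s -o /dev/null "$KIBANA_URL/api/status"; do
  echo "Waiting for Kibana at $KIBANA_URL ..."
  sleep 5
done

# POST the exported objects to the dashboard import API.
curl -s -XPOST "$KIBANA_URL/api/kibana/dashboards/import" \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  -d @"$DASHBOARD_FILE"

Running this as a one-shot service in docker-compose means every fresh deployment gets the dashboard automatically, with no volume needed.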

Does this help?

Thanks,
CJ


Thanks for sharing this information.

I exported the dashboard from one of the existing systems. It was an array; I tried to import it, but it threw an error. I removed the [ ] and verified that it's valid JSON.

{
    "_id": "dash1",
    "_type": "dashboard",
    "_source": {
      "title": "dash1",
      "hits": 0,
      "description": "",
      "panelsJSON": "[{\"col\":1,\"id\":\"filter_error\",\"panelIndex\":1,\"row\":5,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"},{\"col\":1,\"columns\":[\"_source\"],\"id\":\"log_search\",\"panelIndex\":4,\"row\":9,\"size_x\":12,\"size_y\":14,\"sort\":[\"timestamp\",\"desc\"],\"type\":\"search\"},{\"col\":1,\"id\":\"visual_logcount_timestamp\",\"panelIndex\":6,\"row\":1,\"size_x\":12,\"size_y\":4,\"type\":\"visualization\"},{\"col\":7,\"id\":\"log_split_priority\",\"panelIndex\":9,\"row\":5,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"}]",
      "optionsJSON": "{\"darkTheme\":true}",
      "uiStateJSON": "{\"P-1\":{\"vis\":{\"legendOpen\":false}},\"P-6\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}},\"P-7\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}},\"P-9\":{\"vis\":{\"legendOpen\":false}}}",
      "version": 1,
      "timeRestore": false,
      "kibanaSavedObjectMeta": {
        "searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}}]}"
      }
    }
  }

But the import still fails. I tried removing the _id, _type, and _source fields, but no luck.

docker@elk-machine1:~$ curl -XPOST localhost:5601/api/kibana/dashboards/import -H 'kbn-xsrf:true' -H 'Content-type:application/json' -d @my-dashboards.json
{"statusCode":400,"error":"Bad Request","message":"\"_id\" is not allowed. \"_type\" is not allowed. \"_source\" is not allowed","validation":{"source":"payload","keys":["_id","_type","_source"]}}
docker@elk-machine1:~$

Note: I exported the dashboard JSON from the Kibana UI instead of using curl, but I believe that shouldn't make a difference. Also, as mentioned in the doc you shared, we can export the dashboard using its ID, but I don't see any ID for the dashboard, neither in Kibana nor in the ES index. Do you know where we can get the Kibana dashboard's unique ID?

{
      "_index" : ".kibana",
      "_type" : "dashboard",
      "_id" : "dash1",
      "_score" : 1.0,
      "_source" : {
        "title" : "dash1",
        "hits" : 0,
        "description" : "",
        "panelsJSON" : "[{\"col\":1,\"id\":\"filter_error\",\"panelIndex\":1,\"row\":5,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"},{\"col\":1,\"columns\":[\"_source\"],\"id\":\"log_search\",\"panelIndex\":4,\"row\":9,\"size_x\":12,\"size_y\":14,\"sort\":[\"timestamp\",\"desc\"],\"type\":\"search\"},{\"col\":1,\"id\":\"visual_logcount_timestamp\",\"panelIndex\":6,\"row\":1,\"size_x\":12,\"size_y\":4,\"type\":\"visualization\"},{\"col\":7,\"id\":\"log_split_priority\",\"panelIndex\":9,\"row\":5,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"}]",
        "optionsJSON" : "{\"darkTheme\":true}",
        "uiStateJSON" : "{\"P-1\":{\"vis\":{\"legendOpen\":false}},\"P-6\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}},\"P-7\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}},\"P-9\":{\"vis\":{\"legendOpen\":false}}}",
        "version" : 1,
        "timeRestore" : false,
        "kibanaSavedObjectMeta" : {
          "searchSourceJSON" : "{\"filter\":[{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}}]}"
        }
      }
    }

You'll need to use the curl export command to export the dashboard in a format appropriate for the curl import command. Here's what the output JSON file will look like:

{
  "version": "7.0.0-alpha1",
  "objects": [
    {
      "id": "2a895040-7aed-11e8-b18c-abda6205a949",
      "type": "dashboard",
      "updated_at": "2018-06-28T16:06:07.428Z",
      "version": 1,
      "attributes": {
        "title": "New Dashboard",
        "hits": 0,
        "description": "",
        "panelsJSON": "[]",
        "optionsJSON": "{\"darkTheme\":false,\"useMargins\":true,\"hidePanelTitles\":false}",
        "version": 1,
        "timeRestore": false,
        "kibanaSavedObjectMeta": {
          "searchSourceJSON": "{\"query\":{\"query\":\"\",\"language\":\"lucene\"},\"filter\":[]}"
        }
      }
    }
  ]
}

The id field here is the UUID. It's the same value you'll see in the URL when viewing this dashboard, e.g. http://localhost:5601/app/kibana#/dashboard/2a895040-7aed-11e8-b18c-abda6205a949.
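
For reference, the export/import round trip on 5.5+ might look like this; the UUID below is just the one from the example output above, and localhost:5601 is a placeholder for your Kibana host.

# Export a dashboard (plus the visualizations and searches it references) by id.
curl -s -XGET "http://localhost:5601/api/kibana/dashboards/export?dashboard=2a895040-7aed-11e8-b18c-abda6205a949" > my-dashboards.json

# Import the same file into another Kibana instance.
curl -s -XPOST "http://localhost:5601/api/kibana/dashboards/import" \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  -d @my-dashboards.json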

CJ

Sorry, somehow I lost track of this issue. In my case I can't see the UUID of the dashboard, but the dashboard has a name, i.e. dash1, and I tried to use that name to download it.

# curl -XGET localhost:5601/api/kibana/dashboards/export?dashboard=dash1 > my-dashboards.json
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    38  100    38    0     0    215      0 --:--:-- --:--:-- --:--:--   215

But it's showing me:

# cat my-dashboards.json
{"statusCode":404,"error":"Not Found"}

I am able to see the dashboard's UI ID in my test lab ELK, but in production I can't find a way to get the dashboard ID.

On the lab system I am able to export and import the dashboard.

Hi Vikrant, sorry, I didn't realize you were using Kibana 4. This functionality was added in Kibana 5.5 (more background here).

If you can't upgrade Kibana, then you might need to export the dashboard using the export UI, and ask that your end users import it manually using the same UI. Does that make sense?
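
If the manual UI round trip is too clunky for an automated docker-compose setup, one possible workaround on Kibana 4 (where dashboards live as plain documents in the .kibana index, as in the document you posted above) is to index the dashboard document into Elasticsearch directly. This is only a sketch: dash1-source.json is a hypothetical file containing just the _source body of your document (title, panelsJSON, and so on), while the index, type, and id match the document shown earlier in this thread, and elasticsearch:9200 is a placeholder host.

# Sketch: push the dashboard's _source body straight into the .kibana index.
# dash1-source.json and the host name are assumptions.
curl -s -XPUT "http://elasticsearch:9200/.kibana/dashboard/dash1" \
  -H 'Content-Type: application/json' \
  -d @dash1-source.json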
