Kibana import API for deployment

Hi,

I have been using ELK for a couple of weeks and have created a first solution, which I would like to distribute to promote ELK in my organisation. Looking through the discussions here, I haven't found a way to successfully export my Kibana objects and then later import them on another system.

For my dashboards, the Kibana 6.4 export/import UI works fine: I can export Saved Objects and import them later on another system. I am happy exporting my saved objects from the Kibana management screen, but I would like to automate deployment for the users, which includes installing the Saved Objects. For that I would like to use the import API. It is marked experimental, but even so the output from the export seems to be incompatible with the import API: the export is an array with no surrounding "objects" wrapper, and it uses "_id", "_type", and "_source" instead of the "id", "type", and "attributes" that the 6.4 documentation requires. I have tried changing the names, but that makes Kibana hang. Not sure if there is a solution here, but distributing your Kibana dashboards along with your ELK solution seems like a must-have feature.
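
For reference, the reshaping I attempted was roughly this jq sketch of the mapping (treat it as a starting point only, since this is the variant that made Kibana hang for me):

jq '{objects: map({id: ._id, type: ._type, attributes: ._source})}' export.json > import.json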

Also, I would prefer to make Kibana completely ready to use, which includes setting the default index as well. Not sure if this is possible.

Any suggestions are welcome.

Cheers
-Ken

Hi Ken,

did you have a look at the .kibana index in Elasticsearch? This is where Kibana stores all index patterns, dashboards, visualizations, etc. If you copy that index to another instance, everything should work there just the same.
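
If the two clusters can reach each other, one way to copy it over wholesale is reindex-from-remote, roughly like this (the hostnames are placeholders, and the destination node needs the source listed under reindex.remote.whitelist in its elasticsearch.yml):

curl -X POST "newhost:9200/_reindex" -H 'Content-Type: application/json' -d'
{
"source": {
"remote": { "host": "http://oldhost:9200" },
"index": ".kibana"
},
"dest": { "index": ".kibana" }
}
'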

With automation in mind, you could also modify the documents in that index before reimporting them. That way you would be able to change things like dashboard names, index patterns, whatever you like!
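
For example, something along these lines would retarget an index pattern on the way over (a sketch only; the field names assume the 6.x .kibana layout where each saved object sits under a key named after its type, and the ids, hosts and pattern are placeholders):

curl -s "oldhost:9200/.kibana/doc/index-pattern:my-pattern-id" |
  jq '._source | .["index-pattern"].title = "logs-prod-*"' > index-pattern.json
curl -X PUT "newhost:9200/.kibana/doc/index-pattern:my-pattern-id" -H 'Content-Type: application/json' -d @index-pattern.json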

Is this what you are looking for?

Cheers,
Frederik

Thanks for the response, Frederik. I have already looked at the .kibana index, mainly to set the default index. I have also tried modifying the exported JSON to make it importable, but as I mentioned, Kibana then hangs. I was really looking for a sanctioned way of doing this, in order to avoid compatibility issues later on. At the moment my only option is to explain in the instructions how to import the objects manually and how to set the default index. It just seems quite primitive that we cannot deploy an ELK solution onto other VMs automatically. I'm using the Docker containers to package everything up nicely, and then I hit this odd gap where the Kibana part cannot be deployed automatically; in this DevOps day and age that seems strange. It also looks like the Elastic folks have gone to the trouble of implementing these APIs, but then left them and the docs in a broken state. So I was looking to see if anyone had managed to get around these bugs.
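
In case it helps anyone else, the default index bit boils down to updating the config document in .kibana, roughly like this (the config document id carries the Kibana version, and the index pattern id here is just a placeholder):

curl -X POST "localhost:9200/.kibana/doc/config:6.4.0/_update" -H 'Content-Type: application/json' -d'
{ "doc": { "config": { "defaultIndex": "my-index-pattern-id" } } }
'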

Hi Ken,

I am not sure this is how it's supposed to work. You can simply index the documents into the Elasticsearch in your Docker container via curl in your deploy scripts, as described in the documentation. Once Docker has brought up ES, just curl to that instance, for example:
curl -X PUT "localhost:9200/twitter/_doc/1" -H 'Content-Type: application/json' -d'
{
"user" : "kimchy",
"post_date" : "2009-11-15T14:12:12",
"message" : "trying out Elasticsearch"
}
'
with the document you got out of the .kibana index. This is how I set up my deployment scripts :slight_smile:

Alright. So you got the Kibana dashboard out of the .kibana index? As I said, I used the export UI. Any chance of showing me an example of how to get all the dashboards, visualisations, etc. out without missing anything? Sorry to bug you.

(Are you by any chance Danish? You seem to have a Danish name, and I am a Dane living in Sydney :slight_smile: )

No problem. There are a few ways to do that. If you want everything inside an index, the snapshot API might be the best fit for you, as it is intended for backups and restores.
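
Roughly, that looks like this (a sketch only; the fs repository needs path.repo configured in elasticsearch.yml, and the location and names are placeholders):

curl -X PUT "localhost:9200/_snapshot/kibana_backup" -H 'Content-Type: application/json' -d'
{ "type": "fs", "settings": { "location": "/mount/backups/kibana" } }
'
curl -X PUT "localhost:9200/_snapshot/kibana_backup/snapshot_1?wait_for_completion=true" -H 'Content-Type: application/json' -d'
{ "indices": ".kibana" }
'
and then on the target cluster, with the same repository registered:
curl -X POST "localhost:9200/_snapshot/kibana_backup/snapshot_1/_restore" -H 'Content-Type: application/json' -d'
{ "indices": ".kibana" }
'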

If you are using curl, you can get things from the .kibana index with something like http://localhost:9200/.kibana/_search?pretty=true&q=*:*&size=100, which would give you up to 100 documents from the index. Adjust the size to your needs. If you want to page through the documents instead, use the scroll API (see the sketch below).
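
The scroll variant looks roughly like this (a sketch; you have to carry the _scroll_id from each response into the next request yourself):

curl -s -X POST "localhost:9200/.kibana/_search?scroll=1m" -H 'Content-Type: application/json' -d'
{ "size": 100, "query": { "match_all": {} } }
'
curl -s -X POST "localhost:9200/_search/scroll" -H 'Content-Type: application/json' -d'
{ "scroll": "1m", "scroll_id": "<_scroll_id from the previous response>" }
'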

Once you have the export, you can push the documents into the new instance with curl as well. I don't know how many documents that would be in your case, as it could bloat your Docker scripts, but it could be iterated over nicely.

(Sorry, I'm German, with the name coming from France AFAIK :slight_smile: )

Thank you again. I managed to get the saved objects out using another approach:

curl -s -X GET "localhost:5601/api/saved_objects/_find?type=search" -H "Content-Type: application/json" -H "kbn-xsrf: true" | python -m json.tool

repeating it for the different types: search, visualization, dashboard, and index-pattern. The result looks very similar to the UI export, but the structure is slightly different. I'll give it a go with the bulk create API. Fingers crossed.
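
What I am planning to try looks roughly like this (untested; the id and attributes are placeholders, and the array would really come from the _find output above):

curl -X POST "localhost:5601/api/saved_objects/_bulk_create" -H "Content-Type: application/json" -H "kbn-xsrf: true" -d'
[
{ "type": "index-pattern", "id": "my-pattern-id", "attributes": { "title": "logs-*", "timeFieldName": "@timestamp" } }
]
'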

Cheers

Sounds good, best of luck!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.