Automatic Kibana index backup

I'm trying to find a way to back up Kibana indexes automatically, without creating a snapshot. Is there a way to do that? Something like a single command: curl http://es:9200/.kibana/dump_all?

I found a way to export the data:

curl -XPOST "http://es:9200/.kibana/_search?size=1000&pretty" -H 'Content-Type: application/json' -d '{"query":{"match_all":{}}}' > kibana_export.json

But when I try to import it, the bulk API requires a type. If I specify a type, the data gets imported as a sub-element of that type:

cat kibana_export.json | jq -c '.hits.hits[] | {"index": {"_index": ".kibana1", "_type": "dashboards"}}, .' | curl -XPOST "http://es:9200/_bulk" -H 'Content-Type: application/json' --data-binary @-

How can I just import what I have?

You only want to import the _source field of each hit, not all the extra metadata. That metadata can, however, be used to build the action/metadata (first) line of each bulk API pair: specifically, pull out _index, _type, and _id.

Example:

{
  "_index" : ".kibana",
  "_type" : "config",
  "_id" : "5.1.2",
  "_score" : 1.0,
  "_source" : {
    "defaultIndex" : ".kibana"
  }
}

becomes

{ "index": { "_index": ".kibana", "_type": "config", "_id": "5.1.2" } }
{ "defaultIndex" : ".kibana"}

Also, before using the bulk API, you'll want to make sure the target index's mappings are configured correctly.
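One way to carry the mappings over, sketched here assuming an Elasticsearch 5.x setup (the .kibana1 target index name is just an example): the _mapping response is wrapped under the source index name, so unwrap it with jq into a create-index body first.

```shell
# Export the mappings of the source index; the response is wrapped
# under the index name, so unwrap it into a create-index body.
curl -s "http://es:9200/.kibana/_mapping" \
  | jq '{mappings: .[".kibana"].mappings}' > kibana_mappings.json

# Create the target index with those mappings before running _bulk.
curl -XPUT "http://es:9200/.kibana1" -H 'Content-Type: application/json' \
  --data-binary @kibana_mappings.json
```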

I understand that, but I still don't know how to automate this transformation with jq.
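For the record, the transformation described in the answer can be automated with a jq filter roughly like this (a sketch assuming kibana_export.json is the raw _search response saved earlier):

```shell
# For every hit, emit an action line built from its metadata,
# followed by the bare _source document, in bulk-API format.
jq -c '.hits.hits[]
       | {index: {_index: ._index, _type: ._type, _id: ._id}}, ._source' \
   kibana_export.json > kibana_bulk.json

# Then re-import the result:
curl -XPOST "http://es:9200/_bulk" -H 'Content-Type: application/json' \
     --data-binary @kibana_bulk.json
```

The bulk API expects each action line immediately followed by its document on the next line, which is exactly what the comma in the jq filter produces.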

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.