Manually loading index patterns and dashboards into Kibana after exporting from Beats

Hi,

Due to the way the network is structured, Beats cannot connect directly to Elasticsearch and Kibana.
Also, Salt is being used to set up the Elastic Stack.

It is trivial to upload the index templates to Elasticsearch; these can be exported from Beats using
*beat export template

However, when using
*beat export index-pattern
and then trying to import the result through the Kibana API endpoint
/api/saved_objects/_import
the import fails.

I am using the 7.5.0 release for all Elastic components, running on Linux. Beats will be running on Windows hosts. I had already considered loading a beat onto one of the Linux hosts that run the Elastic components, but that doesn't help with Winlogbeat, which only runs on Windows.

There needs to be a manual way to install index patterns and dashboards.
I even tried loading the exported index pattern through the Kibana UI and got these errors:

Support for JSON files is going away
Use our updated export to generate NDJSON files, and you'll be all set.
Sorry, there was an error
Saved objects file format is invalid and cannot be imported.

What am I doing wrong?

Thanks,

John

OK, I think I have found a solution.

The clue was to compare the output of an export from Kibana with an export from a beat.
The beat export format wraps the items in an objects array:

{
  "objects": [
     { ... }
],
  "version": ""
}

Also, the output is pretty-printed; it needs to be converted to NDJSON, since the API complains if the file isn't NDJSON.
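As a minimal sketch of that conversion (the sample object and its fields are made up for illustration):

```shell
# Feed a wrapped export (the structure shown above) to jq and unwrap the
# objects array into compact, one-object-per-line NDJSON.
printf '{"objects":[{"id":"a","type":"index-pattern"}],"version":""}' \
  | jq -c '.objects[]'
# prints: {"id":"a","type":"index-pattern"}
```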

I tried this with a Filebeat index pattern:

jq -c '.objects[0]' 7.5.0/filebeat/filebeat_7.5.0_index-pattern.json > filebeat-idx.ndjson

  • -c produces compact output, one object per line, as NDJSON requires.
  • '.objects[0]' selects the first object in the objects array; in this case there is only one object.

Using the curl command:
curl -XPOST "https://localhost:5601/api/saved_objects/_import" -H "kbn-xsrf: true" --form file=@filebeat-idx.ndjson
{"success":true,"successCount":1}

Now to examine the dashboards.

I have got a working solution to manually load the beats Elasticsearch and Kibana assets using salt.

On a Windows host, I prepare a tar file with the required items (I do this on Windows because I am using Winlogbeat, which is only built for Windows).

I export the items [template, ilm-policy, index-pattern] with the command:
*beat export <item>
I also copy the folder kibana/x/dashboard.

The template loads without any modifications into Elasticsearch.
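If it helps, here is a hedged sketch of that upload step; the endpoint URL, template name, and file name are assumptions for illustration (7.x Beats export legacy templates, which go under the _template API):

```shell
ES_URL="https://localhost:9200"            # assumed Elasticsearch endpoint
TEMPLATE_NAME="winlogbeat-7.5.0"           # assumed template name
TEMPLATE_FILE="winlogbeat.template.json"   # assumed exported template file

# PUT the exported template to the legacy _template endpoint.
upload_template() {
  curl -XPUT "$ES_URL/_template/$TEMPLATE_NAME" \
       -H 'Content-Type: application/json' \
       -d "@$TEMPLATE_FILE"
}
```

Calling upload_template should make Elasticsearch answer with an acknowledgement.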

For the Kibana items [index-pattern, dashboard], convert each JSON file into NDJSON format.
For this task I used jq:
jq -c '.objects[]' in-file.json > out-file.ndjson

The query strips the extra JSON elements wrapped around the array of items.
The -c flag produces compact output, one object per line, which is what NDJSON requires.

You can then upload the NDJSON files to the API with curl:
curl -XPOST "https://localhost:5601/api/saved_objects/_import" -H "kbn-xsrf: true" --form file=@out-file.ndjson
{"success":true,"successCount":x}
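The whole convert-and-upload step can be sketched as a small script; KIBANA_URL and the exported/ directory are assumptions for illustration:

```shell
KIBANA_URL="https://localhost:5601"   # assumed Kibana endpoint

# Unwrap the beat export wrapper: one compact JSON object per line.
convert_to_ndjson() {
  jq -c '.objects[]' "$1" > "$2"
}

# Upload one NDJSON file through the saved objects import API
# (the kbn-xsrf header is required by Kibana).
import_ndjson() {
  curl -XPOST "$KIBANA_URL/api/saved_objects/_import" \
       -H "kbn-xsrf: true" --form "file=@$1"
}

for f in exported/*.json; do
  [ -e "$f" ] || continue             # skip if the glob matched nothing
  out="${f%.json}.ndjson"
  convert_to_ndjson "$f" "$out"
  import_ndjson "$out"
done
```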

I compared this method against letting Beats set up Kibana directly, and the results are identical.

Just posting for anyone else stumbling into this problem.

