Maximum size of reports?


I seem to be having an issue while generating reports for large amounts of data. I wanted to confirm whether I'm hitting some kind of maximum size or whether this is a configuration issue. Currently, I am trying to pull a day's worth of logs off of one of our services for analysis, but I can only generate 20MB (about 15 seconds) worth of data. While this will be a large dataset, I'm wondering if I'm hitting a hard cap on how large the export can be. Even after increasing the timeout and the max CSV size, I still cannot generate a report. Below is the error I am seeing in our kibana.log file:

{"type":"log","@timestamp":"2018-01-29T19:37:16Z","tags":["reporting","esqueue","worker","debug"],"pid":18707,"message":"jd0lg6ox0efn7d14ac7atpm2 - Failure occurred on job jd0m5q9p0efn7d14ac3iy2o0: Error: "toString()" failed\n at Buffer.toString (buffer.js:495:11)\n at MaxSizeStringBuilder.getString (/usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/max_size_string_builder.js:20:46)\n at /usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:51:24\n at next (native)\n at step (/usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:20:191)\n at /usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:20:361"}
{"type":"log","@timestamp":"2018-01-29T19:37:16Z","tags":["reporting","worker","debug"],"pid":18707,"message":"CSV: Worker error: (jd0m5q9p0efn7d14ac3iy2o0)"}

Within max_size_string_builder.js, it calls Buffer.alloc(maxSizeBytes) to determine the max size. Would this suggest that the maximum size of a file is 1GB?

If this is the case, is there another way to dump large datasets?

Thank you!


I think we need to consider two things here:

  1. From Kibana's side, there is a default size limit (`xpack.reporting.csv.maxSizeBytes`) that you can try increasing: "The maximum size of a CSV file before being truncated. This setting exists to prevent large exports from causing performance and storage issues. Defaults to 10485760 (10MB)."
  2. From Elasticsearch's side, the maximum HTTP request size (`http.max_content_length`) is 100MB by default.
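For reference, the two settings above would look something like this (the values here are illustrative, not recommendations):

```yaml
# kibana.yml — raise the CSV export cap, in bytes; defaults to 10485760 (10MB)
xpack.reporting.csv.maxSizeBytes: 104857600

# elasticsearch.yml — maximum HTTP request body size; defaults to 100mb.
# Reporting stores the generated CSV in an Elasticsearch index, so the
# export also has to fit under this limit.
http.max_content_length: 200mb
```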

Let me know if this helps.


Thanks for the quick reply! I have increased the default size in Kibana; however, it was greater than 100MB, which would explain why it errored out. I'm going to test with a 100MB report and see if it generates successfully.

So a follow-up question: is there a way to create large data dumps?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.