Hi,
I seem to be having an issue while generating reports for large amounts of data, and I wanted to confirm whether I'm hitting some kind of maximum size or whether this is a configuration issue. I'm currently trying to pull a day's worth of logs off one of our services for analysis, but I can only generate about 20 MB, or roughly 15 seconds, worth of data. The full dataset will be large, so I'm wondering if I'm hitting a hard cap on how large the export can be. Even after increasing the job timeout and the max CSV size, I still cannot generate a report. Below is the error I am seeing in our kibana.log file:
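For reference, these are the reporting settings I increased in kibana.yml (the values below are examples, not my exact configuration):

```yaml
# kibana.yml -- example values, not my exact configuration
xpack.reporting.queue.timeout: 300000          # per-job timeout, in milliseconds
xpack.reporting.csv.maxSizeBytes: 104857600    # max CSV export size, in bytes (100 MB here)
```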
```
{"type":"log","@timestamp":"2018-01-29T19:37:16Z","tags":["reporting","esqueue","worker","debug"],"pid":18707,"message":"jd0lg6ox0efn7d14ac7atpm2 - Failure occurred on job jd0m5q9p0efn7d14ac3iy2o0: Error: "toString()" failed\n at Buffer.toString (buffer.js:495:11)\n at MaxSizeStringBuilder.getString (/usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/max_size_string_builder.js:20:46)\n at /usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:51:24\n at next (native)\n at step (/usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:20:191)\n at /usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:20:361"}
{"type":"log","@timestamp":"2018-01-29T19:37:16Z","tags":["reporting","worker","debug"],"pid":18707,"message":"CSV: Worker error: (jd0m5q9p0efn7d14ac3iy2o0)"}
```
Within max_size_string_builder.js, the code calls `Buffer.alloc(maxSizeBytes)` to pre-allocate a buffer of the configured maximum size, and the failure is thrown from `Buffer.toString()` when that buffer is converted to a string. Would this suggest that the maximum size of an exported file is 1 GB?
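To make sure I'm reading the stack trace correctly, here is a simplified sketch of what I understand `MaxSizeStringBuilder` to be doing (this is my own reconstruction from the trace, not the actual Kibana source): a fixed-size buffer is allocated up front from `maxSizeBytes`, appends are refused once the cap is reached, and `getString()` converts the whole buffer in one `toString()` call, which is the call that fails for very large buffers:

```javascript
// Simplified sketch (my reconstruction, not the actual Kibana code) of a
// MaxSizeStringBuilder-style class, based on the stack trace above.
class MaxSizeStringBuilder {
  constructor(maxSizeBytes) {
    // A fixed-size buffer is allocated once, at the configured cap.
    this._buffer = Buffer.alloc(maxSizeBytes);
    this._size = 0;
  }

  // Append a chunk if it fits; report false once the cap would be exceeded.
  tryAppend(chunk) {
    const bytes = Buffer.byteLength(chunk);
    if (this._size + bytes > this._buffer.length) return false; // cap hit
    this._buffer.write(chunk, this._size);
    this._size += bytes;
    return true;
  }

  getString() {
    // This single toString() over the whole buffer is where
    // 'Error: "toString()" failed' is thrown when the buffer is larger
    // than V8 can represent as one string.
    return this._buffer.toString('utf8', 0, this._size);
  }
}

const b = new MaxSizeStringBuilder(16);
console.log(b.tryAppend('hello,'));              // fits within the 16-byte cap
console.log(b.tryAppend('world'));               // still fits
console.log(b.tryAppend('overflow-this-cap'));   // refused: would exceed the cap
console.log(b.getString());
```

If that reading is right, the practical ceiling wouldn't be the configured `maxSizeBytes` at all but V8's maximum string length, which is well under 1 GB, so raising the setting past that point would just move the failure into `toString()`.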
If this is the case, is there another way to dump large datasets?
Thank you!