Hi,
I'm running into an issue when generating reports for large amounts of data exported to CSV, and I wanted to confirm whether I'm hitting some kind of maximum size or whether this is a configuration issue. I'm trying to pull a Twitter users database for analysis as graph data, but I can only generate about 20MB. Since this will be a large dataset, I'm wondering if there is a hard cap on how large the export can be. Even after increasing the reporting timeout in kibana.yml, the CSV max size, and http.max_content_length in elasticsearch.yml (the settings I changed are sketched after the log below), I still cannot generate a report. Below is the error I'm seeing in our kibana.log file:
```
{"type":"log","@timestamp":"2018-01-29T19:37:16Z","tags":["reporting","esqueue","worker","debug"],"pid":18707,"message":"jd0lg6ox0efn7d14ac7atpm2 - Failure occurred on job jd0m5q9p0efn7d14ac3iy2o0: Error: "toString()" failed\n at Buffer.toString (buffer.js:495:11)\n at MaxSizeStringBuilder.getString (/usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/max_size_string_builder.js:20:46)\n at /usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:51:24\n at next (native)\n at step (/usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:20:191)\n at /usr/share/kibana/plugins/x-pack/plugins/reporting/export_types/csv/server/lib/generate_csv.js:20:361"}
{"type":"log","@timestamp":"2018-01-29T19:37:16Z","tags":["reporting","worker","debug"],"pid":18707,"message":"CSV: Worker error: (jd0m5q9p0efn7d14ac3iy2o0)"}
```
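For reference, here is roughly what I have in the two config files. The values are just examples of what I last tried, and the setting names are the standard X-Pack reporting keys I believe apply here, so treat this as a sketch of my config rather than an exact copy:

```yaml
# --- kibana.yml (reporting settings I raised) ---
xpack.reporting.queue.timeout: 300000        # job timeout in milliseconds (default 120000)
xpack.reporting.csv.maxSizeBytes: 104857600  # max CSV size in bytes, here ~100MB (default 10485760 = 10MB)

# --- elasticsearch.yml (raised so large responses aren't rejected) ---
http.max_content_length: 400mb               # default is 100mb
```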
Is there a way to dump large datasets (~2GB)?
Thank you!