Export Over 4 Million Records to a CSV File on a Self-Hosted Cluster

Hello Everyone,

I have an issue with Reporting. I need to create a CSV file for our customer, and they want the file to contain over 4 million records.

I read the documentation, and it says I can set the maximum size in bytes with the xpack.reporting.csv.maxSizeBytes option in kibana.yml.
I tested it with approximately 1 GB of data, but it always failed.
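For reference, this is how I set the option in kibana.yml (the 1 GB value is just the size I tested with):

```
# kibana.yml -- raise the CSV report size limit (value is in bytes)
xpack.reporting.csv.maxSizeBytes: 1073741824
```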

So, as of 7.x, is this option really only enabled for Cloud products?

If not, does anyone know how I can export 4.6 million records to a CSV file?

Platform: Self-Hosted
License: Basic
Elasticsearch & Kibana Version: 7.7.0



Do you have a Logstash instance? You could use the elasticsearch input plugin to read the data you need and the csv output plugin to write the file locally.
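A minimal pipeline sketch of what that could look like; the host, index name, query, and field list below are placeholders you would replace with your own, and note that the csv output plugin may need to be installed separately (e.g. via `bin/logstash-plugin install logstash-output-csv`):

```
input {
  elasticsearch {
    hosts => ["http://localhost:9200"]        # placeholder host
    index => "my-index"                       # placeholder index
    query => '{ "query": { "match_all": {} } }'
  }
}
output {
  csv {
    path   => "/tmp/export.csv"               # where the file is written
    fields => ["field1", "field2"]            # placeholder column list
  }
}
```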

Best regards

Welcome to our community! :smiley:

Failed how? What do your logs show?

Oh! Thank you @warkolm, I'm glad to see you :slight_smile:

I got this error:

Unable to generate report Request Entity Too Large

And @Wolfram_Haussig

Thank you for your reply. I will try using Logstash. Yes, we have Logstash nodes.


Another approach you could take would be to write a quick script to do a search that matches all the records you want, say 1,000 records at a time, then paginate through the results and put them into a CSV file.
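A rough sketch of that pagination loop, using Elasticsearch's search_after-style cursor. To keep it self-contained, the query is abstracted into a `search` callable; with a real cluster you would make that callable wrap `es.search(...)` from the official client, passing `size`, a `sort` clause, and the `search_after` cursor, and returning the hits list. The function, field names, and page size here are illustrative, not an official API:

```python
import csv

def export_to_csv(search, out_path, fields, page_size=1000):
    """Paginate through all matching records and write them to a CSV file.

    `search` is any callable (page_size, after) -> list of hits, where each
    hit looks like {"sort": [...], "_source": {...}} (the shape returned in
    an Elasticsearch response's hits.hits). `after` is None on the first
    page, then the `sort` values of the last hit of the previous page.
    """
    total = 0
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        after = None
        while True:
            hits = search(page_size, after)
            if not hits:
                break  # no more pages
            for hit in hits:
                # Keep only the requested columns; missing fields become "".
                writer.writerow({k: hit["_source"].get(k, "") for k in fields})
            total += len(hits)
            # Cursor for the next page: sort values of the last hit.
            after = hits[-1]["sort"]
    return total
```

Because the file is written page by page, memory use stays flat no matter how many millions of records you stream through it.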


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.