Does Kibana support big CSV exports?

Hi guys, I've been facing a problem with exports in Kibana for a week now and would be happy to get some help here.
I want to generate big CSV reports from Kibana that could reach 3 GB.
I was surprised to find that every time I try to export a file larger than 300 MB from Kibana, Elasticsearch gets killed automatically.

I run Kibana as follows:

NODE_OPTIONS="--max-old-space-size=4096" ./bin/kibana 

and Elasticsearch as follows:

bin/elasticsearch

My configuration in kibana.yml is:

xpack.security.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
elasticsearch.requestTimeout: 900000
xpack.reporting.queue.timeout: 120000000
xpack.reporting.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
xpack.reporting.csv.maxSizeBytes: 400000000 
xpack.encryptedSavedObjects.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
csp.strict: true
xpack.reporting.enabled: true
logging.verbose: true
server.maxPayloadBytes: 10485760  

and my configuration in elasticsearch.yml is:

indices.breaker.total.use_real_memory: false
http.max_content_length: 500mb

and my jvm.options config is as follows:

-Xms4g
-Xmx4g

So when I try to download a report from Kibana that should be around 400 MB based on my config, Kibana gets killed automatically:

log   [16:58:23.585] [warning][csv][execute-job][khz2to7r0wtld0ca1c33bvgg][reporting] max Size Reached
  log   [16:58:23.586] [debug][csv][execute-job][khz2to7r0wtld0ca1c33bvgg][reporting] executing clearScroll request
  log   [16:58:23.588] [debug][csv][execute-job][khz2to7r0wtld0ca1c33bvgg][reporting] finished generating, total size in bytes: 1048575667
  log   [16:58:31.248] [info][esqueue][queue-worker][reporting] khz2q2s00wtld0ca1c77dr67 - Job execution completed successfully
Killed

So I want to ask: am I doing something wrong?
I would appreciate any help from you guys!

There's some previous discussion on this topic, here: Csv file export with 1 million records(approx 1 gb size) in kibana reporting

I don't know if you're doing anything wrong, but I believe that is pushing the limits of what Kibana can do.
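
By the way, the plain Killed line at the end of your output usually means the Linux OOM killer took the process down rather than Kibana exiting on its own; something like dmesg | grep -i kill on the host should confirm that.

If you do still want to push it, every limit in the chain has to be raised together. As far as I understand, the finished CSV has to fit in Kibana's Node.js heap and then pass through Elasticsearch's HTTP limit when the report is stored. Here is a rough, untested sketch based on the settings you already have; the numbers are only illustrative assumptions, and I'm not sure all of these settings accept values this large:

In kibana.yml:

xpack.reporting.csv.maxSizeBytes: 3221225472   # ~3 GiB, assuming the setting allows a value this big
xpack.reporting.queue.timeout: 120000000
elasticsearch.requestTimeout: 900000

In elasticsearch.yml:

http.max_content_length: 2047mb   # I believe this one may be capped around 2 GB, which alone could rule out a 3 GB report

In jvm.options:

-Xms8g
-Xmx8g

And start Kibana with a Node heap larger than the report, for example:

NODE_OPTIONS="--max-old-space-size=8192" ./bin/kibana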

But I'm not using Logstash to download the CSV; I'm downloading it from the browser (Kibana on localhost).
I'm just asking whether it is possible to download a CSV file bigger than 3 GB or not.
If yes, what should the configs look like in elasticsearch.yml, kibana.yml, and jvm.options?
Thanks

@Elastic Team Member I would appreciate any help! :frowning:
