CSV file export with 1 million records (approx. 1 GB) in Kibana Reporting

Hi All,
I am trying to export a CSV file with 1 million records (approx. 1 GB in size) from Kibana Reporting.
I have tried different combinations of the configs below but keep getting different errors. I also increased the Kibana heap size to 4 GB.
Is it practically possible to export such a large file?

kibana.yml :
xpack.monitoring.enabled: true
server.maxPayloadBytes: 10485760
xpack.reporting.queue.timeout: 3600000
xpack.reporting.csv.maxSizeBytes: 104857600

Error received in the Kibana Reporting tab:
Unable to generate report
Max attempts reached (3)

Unable to generate report
Request Entity Too Large

@joelgriffith is this something you can shed some light on? The only thing that came to my mind was to increase the limit with the server.maxPayloadBytes setting, but are there more settings to try?

Also, the Kibana logs and screenshots would help.
Thanks
Rashmi


@joelgriffith and ELK team,

Any suggestions on this?

Yes: you'll need to update your Elasticsearch cluster to support the large file size: http.max_content_length should be set to at least the same size as Reporting's xpack.reporting.csv.maxSizeBytes (https://www.elastic.co/guide/en/kibana/6.8/reporting-settings-kb.html#reporting-csv-settings).
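
For reference, here's a minimal sketch of the paired settings (the 1 GB values are illustrative, not something I've tested at this scale):

elasticsearch.yml :
http.max_content_length: 1gb   # default is 100mb; must cover the full report payload

kibana.yml :
xpack.reporting.csv.maxSizeBytes: 1073741824   # 1 GB; must not exceed http.max_content_length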

I'll note that 1 GB / 1M records is outside the bounds of a normal CSV export, and I'm not sure how Elasticsearch/Kibana will handle a file that large.


Hey Sarvendras,

I had encountered a similar roadblock and figured out a workaround with some help on my end.

If you are utilizing Logstash, you can use the "elasticsearch" input with the "query" option and have the output use the "csv" plugin.

Make sure you have permissions to write wherever you have the path pointed to.
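
For example, on a package install where Logstash runs as the logstash user (an assumption; adjust for your setup), something like:

sudo mkdir -p /data01/csv_files
sudo chown logstash:logstash /data01/csv_files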

I hope this helps; it's not a real solution, but I believe it'll help you get your reports.

(https://www.elastic.co/guide/en/logstash/6.8/plugins-inputs-elasticsearch.html)
(https://www.elastic.co/guide/en/logstash/6.8/plugins-outputs-csv.html)

input {
   elasticsearch {
      hosts => ["host1:9200", "host2:9200"]
      index => "index-2020"
      user => "elastic"
      password => "xxxx"
      # "query" defaults to match_all; narrow it here if you only need a subset
      query => '{ "query": { "match_all": {} } }'
   }
}

filter {
   # drop fields you don't want in the CSV
   mutate { remove_field => ["field1","field2","@timestamp","@version"] }
}

output {
   csv {
      fields => [ "myfield1","myfield2","myfield3" ]
      csv_options => { "col_sep" => "," }
      path => "/data01/csv_files/test.csv"
   }
   # prints every event to the console; remove this for faster exports
   stdout { codec => rubydebug }
}
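
You can then run the export by pointing Logstash at the config file, e.g. (csv_export.conf is just an illustrative name for the pipeline above):

bin/logstash -f csv_export.conf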

Pretty much the same as what the previous user said:

https://discuss.elastic.co/t/how-to-export-csv-in-kibana-7-5-with-more-then-1-million-row/214526/4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.