How to export a CSV with more than 1 million rows in Kibana 7.5

I need help... I have a problem with CSV reporting in Kibana. I want to export a CSV with 1,000,000 rows from Discover, but it always fails. I get the notification [Reporting generate CSV error: Unable to generate report Max attempts reached (3)]. What must I do to export a CSV at this volume?

Depending on why your report is failing, there are a few settings you can tweak in your kibana.yml (a combined sketch follows the list):

  • xpack.reporting.csv.maxSizeBytes: by default this is set to 1024 * 1024 * 10 = 10MB; increase it if your export will be larger
  • xpack.reporting.csv.scroll.duration: default 30s; increase it if you see errors in the log about the scroll not being found or expiring
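
For example, a minimal kibana.yml sketch with both settings raised (the values are illustrative, not recommendations):

xpack.reporting.csv.maxSizeBytes: 104857600     # 100MB instead of the 10MB default
xpack.reporting.csv.scroll.duration: 2m         # give slow scrolls more time before they expire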

If these don't work, look at the logs for information about why the report is failing. It's possible you have a scripted field that fails on some documents. If so, fix the scripted field to handle the data in the failing documents, or remove it.

I have set xpack.reporting.csv.maxSizeBytes, but now I get another error when reporting CSV: [Error: Failed to decrypt report job data. Please ensure that xpack.reporting.encryptionKey is set and re-generate this report. Error: Unsupported state or unable to authenticate data]. Must I set xpack.reporting.encryptionKey?

Yes it is required for stable usage across restarts. You should see a warning when starting Kibana without it.
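
A minimal sketch in kibana.yml (the key below is a placeholder; use your own string of at least 32 characters and keep it identical across restarts and Kibana instances):

xpack.reporting.encryptionKey: "replace_with_a_random_32plus_character_string"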

I had the same type of problem: the max size was reached because the report was big. I got around it by exporting via Logstash;
the output CSV was about 500MB.

input {
  elasticsearch {
    # Scroll through every document in the source index.
    hosts    => ["host1:9200", "host2:9200"]
    index    => "index-2020"
    user     => "elastic"
    password => "xxxx"
  }
}

filter {
  # Drop fields that should not end up in the CSV.
  mutate { remove_field => ["field1", "field2", "@timestamp", "@version"] }
}

output {
  csv {
    # Columns are written in the order listed here.
    fields      => ["myfield1", "myfield2", "myfield3"]
    csv_options => { "col_sep" => "," }
    path        => "/data01/csv_files/test.csv"
  }
  # Echo each event to the console to watch progress.
  stdout { codec => rubydebug }
}
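
You can run the pipeline with something like bin/logstash -f csv_export.conf (filename hypothetical). The elasticsearch input scrolls through the whole index, so it isn't subject to Kibana's reporting size limit; the stdout output is optional and only there for debugging.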

I did configure xpack.reporting.encryptionKey, but it still fails with the error message [Max attempts reached (3)]. In the log I have these messages:

{"type":"log","@timestamp":"2020-01-17T07:05:37Z","tags":["reporting","warning"],"pid":6533,"message":"xpack.reporting.csv.maxSizeBytes (1309355388) is higher than ElasticSearch's http.max_content_length (104857600). Please set http.max_content_length in ElasticSearch to match, or lower your xpack.reporting.csv.maxSizeBytes in Kibana to avoid this warning."}.

and

{"type":"log","@timestamp":"2020-01-17T07:05:21Z","tags":["reporting","csv","execute-job","xxx","debug"],"pid":824,"message":"finished generating, total size in bytes: 188286137"}.

What should I set http.max_content_length and xpack.reporting.csv.maxSizeBytes to? What is the maximum size they can be?
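
For reference, the warning above asks for the two limits to line up, and the second log line shows the finished report was about 188MB. A sketch that would cover it, assuming a 200MB cap (values illustrative; Elasticsearch's http.max_content_length defaults to 100MB, and raising it increases heap pressure, so keep it as low as your reports allow):

# elasticsearch.yml
http.max_content_length: 200mb

# kibana.yml
xpack.reporting.csv.maxSizeBytes: 209715200     # 200 * 1024 * 1024 bytes; must not exceed http.max_content_length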

@joelgriffith any ideas here on how to get more information about why this is failing?