CSV reporting always fails with 5M records

Hi,

I was trying to export a CSV with 5 million records. Each record contains only a datetime and an MD5 hash, so the file size should be small even though there are 5 million records.
I have a limit of roughly 900 MB in both:
xpack.reporting.csv.maxSizeBytes
http.max_content_length

Hi @Hari_Krishnan,

How does it fail exactly? What do you see in the Kibana logs (you'd need to enable debug logs with logging.verbose: true)? What does your kibana.yml look like, and what version of the stack do you use?

Best,
Oleg

I have enabled logging.verbose: true in the kibana.yml file.
Kibana version: 7.0.1
ES version: 7.0.1

kibana.yml
xpack.reporting.csv.maxSizeBytes: 905890000
xpack.reporting.queue.timeout: 120000
logging.verbose: true

elasticsearch.yml
http.max_content_length: 1000mb
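For anyone comparing these values: note that the settings use different units. A commented restatement of the same configuration (values from the post; the unit comments are my reading of the docs):

```
# kibana.yml
xpack.reporting.csv.maxSizeBytes: 905890000   # bytes (~905 MB)
xpack.reporting.queue.timeout: 120000         # milliseconds (2 minutes)

# elasticsearch.yml
http.max_content_length: 1000mb               # byte-size string
```

One thing worth ruling out: if a reporting job runs longer than the queue timeout, it is marked as failed and retried, so a 2-minute timeout can be tight for a 5-million-row export.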

/var/log/messages output:
tags":["reporting","csv","debug"],"pid":3703,"message":"executing scroll request"}
tags":["reporting","csv","debug"],"pid":3703,"message":"executing scroll request"}
tags":["reporting","csv","debug"],"pid":3703,"message":"executing scroll request"}
(the same line repeats many more times)

Well, that's not an error, and it's expected. The Scroll API is used to work with large data sets, so it should eventually do the job, unless you have other errors in the logs?
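To illustrate why the repeated "executing scroll request" lines are expected: each scroll call fetches the next page of hits until the result set is drained, so one debug line per page is normal. A minimal sketch of that pagination pattern, simulated in memory (no Elasticsearch client involved; the names are mine, not Kibana's):

```python
# Scroll-style pagination, simulated: each "scroll request" returns the
# next page until the result set is drained, which is why the debug log
# prints "executing scroll request" once per page.
def scroll_pages(records, page_size):
    """Yield successive pages, like repeated Scroll API calls."""
    for start in range(0, len(records), page_size):
        yield records[start:start + page_size]

# 5 million rows fetched in pages of 500 would log the scroll message
# 10,000 times -- repetition alone is not an error.
rows = list(range(10_000))           # stand-in for the real hits
pages = list(scroll_pages(rows, 500))
print(len(pages))                    # → 20 (one "scroll request" each)
print(sum(len(p) for p in pages))    # → 10000 (all rows retrieved)
```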

This is the last log output if I search using the "csv" keyword:

{"type":"log","@timestamp":"2020-06-29T11:27:50Z","tags":["reporting","csv","debug"],"pid":3703,"message":"executing clearScroll request"}
{"type":"log","@timestamp":"2020-06-29T11:27:50Z","tags":["reporting","csv","debug"],"pid":3703,"message":"finished generating, total size in bytes: 13103907"}

Here I assume the total report size is 13,103,907 bytes (~13 MB), and my configured limit is well above that.


But this log entry says that it completed successfully. What doesn't work for you, then? Also, do you see any log entries with the "error" tag at all?

In the UI it shows "failed after 3 attempts". Is there any location where I can check for the CSV file?

I see, then there should definitely be some failure in the logs (with logging.verbose: true). Either try to find it in the full logs yourself or share them here (please remove any sensitive info you may have there).
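To narrow down a large log file to failures, one option is to filter for entries carrying the "error" tag. A small sketch, assuming the JSON-per-line log format shown in the pasted entries above (the function name and sample lines are mine):

```python
import json

def error_entries(lines):
    """Return parsed log entries whose "tags" list contains "error"."""
    out = []
    for line in lines:
        try:
            entry = json.loads(line)
        except ValueError:
            continue  # skip truncated or non-JSON lines
        if "error" in entry.get("tags", []):
            out.append(entry)
    return out

sample = [
    '{"type":"log","tags":["reporting","csv","debug"],"message":"executing scroll request"}',
    '{"type":"log","tags":["reporting","error"],"message":"something failed"}',
]
for entry in error_entries(sample):
    print(entry["message"])  # prints only the error-tagged entry
```

The same idea works on the real file by passing `open("/path/to/log")` as `lines`.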

Attaching log

https://pastebin.com/XtQF89bv

Any luck, @azasypkin?