Hello, I need to generate large CSV reports, so I want to set 'xpack.reporting.csv.maxSizeBytes' to a high value, but I'm being capped at 50MB. How can I go beyond that?
The error when trying to save the changes is:
Kibana - 'xpack.reporting.csv.maxSizeBytes': has to be a number between [0, 52428800] inclusive
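For reference, on a self-managed deployment this setting lives in kibana.yml and could be raised with something like the sketch below. The 157286400 value (roughly 150MB) is purely illustrative, not a recommendation; on Elastic Cloud the same field is edited through the deployment's user settings, which is where the 50MB (52428800-byte) cap in the error above is enforced.

# kibana.yml -- illustrative sketch, not a recommendation
# Raise the maximum size (in bytes) of a generated CSV report to ~150MB.
xpack.reporting.csv.maxSizeBytes: 157286400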
This could be a restriction that is no longer needed in newer versions, or simply a bug in the application. You can report the issue to the support team at https://cloud.elastic.co/support
Thank you @tsullivan, I suspected the same after the change to chunked requests. Also, the Elasticsearch body size limit cannot be changed on Elastic Cloud. Do you think the default value would accept a large CSV export?
Which version are you using? In v7.15, the dev team added an optimization that saves CSV export data into Elasticsearch by partitioning the CSV data into chunks stored across multiple ES documents. Is that what you are referring to with "the change with the chunked requests"?
In that optimization, Kibana detects the max body size that ES will accept, and uses that as the size for each chunk in the partitioned data.
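For context, the body size limit in question is Elasticsearch's http.max_content_length, which defaults to 100mb. On a self-managed cluster it could be raised in elasticsearch.yml as sketched below (the 200mb value is only an example); as noted above, this is not something you can change on Elastic Cloud.

# elasticsearch.yml -- illustrative sketch for a self-managed cluster only
# http.max_content_length caps the size of a single HTTP request body (default 100mb).
# Kibana's chunked CSV storage sizes each chunk so that every write stays under this limit.
http.max_content_length: 200mb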
Sorry to hear that. I just want to emphasize that the only way to see improvement in this area is to file an issue at https://cloud.elastic.co/support.
We made a custom request and Elastic Cloud Support increased the limit. We started at 500MB and could generate a 200K-row report in 4 minutes, so it's a good start. Our Kibana node has 4GB of memory. The cap removal was indeed approved on your side, so hopefully it will be removed in future versions.