CSV Reporting Truncating Results in 7.15.x

Using Elastic Kibana 7.13.3 (and earlier versions) I was able to save a search and use Share > CSV Reports to export reports of up to around 500 MB. (We have a security requirement to archive logs from Windows (Winlogbeat), Linux (Filebeat), ESXi, and ElastiFlow switches.)

However, with Elastic Kibana 7.15.1 and the same configuration settings, reports that used to be around 500 MB are now being truncated to roughly 10% of the expected size/number of rows.
Has anything changed in how this needs to be configured, or is there a bug in Reporting in 7.15.x?

Here are the settings that allowed the larger reports:

elasticsearch.yml
http.max_content_length: 2047mb

kibana.yml
server.maxPayloadBytes: 104857600
xpack.reporting.enabled: true
xpack.reporting.queue.timeout: 3600000
xpack.reporting.csv.maxSizeBytes: 1258291200

node.options
--max-old-space-size=16384

Are you sure that server.maxPayloadBytes value is OK? 104857600 bytes is only about 104 MB, much smaller than the 1.2 GB you allow for CSV reports.

I set server.maxPayloadBytes to the same value as xpack.reporting.csv.maxSizeBytes (1258291200) ... and the CSV file is still being truncated. Normally, for the ESXi logs, I can save a CSV of around 1,650,000 hits/rows. I ran one after this change and got only 104,303 rows out of 760,728 hits from the dashboard, and the row count comes out the same whether the query matches 200,000 or 1,650,000 hits.
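To be explicit, kibana.yml now has both limits set to the same value:

kibana.yml
server.maxPayloadBytes: 1258291200
xpack.reporting.csv.maxSizeBytes: 1258291200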

So, not being able to save off larger CSV reports is a big deal for us. If it isn't fixed I see only two options: roll back to my 7.13.3 data (luckily I saved a copy before the initial upgrade to 7.15.0, but I would lose about two weeks of logs), or automate the weekly reporting in chunks small enough (around 100K rows) to work.

I haven't tried automating CSV reports, but is it possible to set up an automation that saves off a CSV report for each two-hour window across an entire week, given a start date and time (always up for a challenge)?
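A rough sketch of what that automation might look like, assuming the usual Reporting workflow: prepare one "Copy POST URL" per two-hour window from the Share > CSV Reports panel, queue each job, and poll the returned job path until the download is ready (Kibana answers 503 while the job is still running). The host, credentials, and file names below are placeholders, and the response handling is an assumption based on the documented Reporting API, so treat this as a starting point rather than a tested script:

# Hypothetical sketch: queue one CSV report per two-hour window and download
# each result once Kibana finishes it.
import time
import requests

KIBANA = "https://kibana.example.com:5601"   # placeholder host
AUTH = ("reporting_user", "password")        # placeholder credentials
HEADERS = {"kbn-xsrf": "true"}               # required by Kibana HTTP APIs

# One "Copy POST URL" per two-hour window, prepared in advance in Kibana
# (Share > CSV Reports). The jobParams payload is deliberately left out here.
post_urls = [
    # f"{KIBANA}/api/reporting/generate/csv_searchsource?jobParams=...",
]

for i, url in enumerate(post_urls):
    r = requests.post(url, auth=AUTH, headers=HEADERS)
    r.raise_for_status()
    # Assumption: the response JSON contains the job's download path,
    # e.g. /api/reporting/jobs/download/<job_id>.
    job_path = r.json()["path"]

    # Poll until the job completes; Kibana returns 503 while it is running.
    while True:
        dl = requests.get(KIBANA + job_path, auth=AUTH, headers=HEADERS)
        if dl.status_code == 200:
            with open(f"report_{i:03d}.csv", "wb") as f:
                f.write(dl.content)
            break
        time.sleep(30)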

This seems like the exact same issue I ran into.
Before 7.15.x, a fairly large CSV report worked without any issues.
After upgrading to 7.15.2, the same CSV report would halt the Kibana instance. I tried to vertically scale the Kibana instance (increasing CPU and memory resources), but it still became unresponsive and never completed the report.

By increasing the following kibana.yml settings:

xpack.task_manager.max_workers: 100
xpack.reporting.queue.timeout: 600000 # was already configured prior to the 7.15.x upgrade
xpack.reporting.csv.maxSizeBytes: 104857600 # was already configured prior to the 7.15.x upgrade
xpack.reporting.csv.scroll.size: 10000

I managed to get the CSV reporting feature to complete, but it returns a truncated report (expected ~500,000 rows, got ~60,000).

Perhaps this is related to [Reporting] Streaming CSV download appears to not be working · Issue #119540 · elastic/kibana · GitHub, as I am still able to generate small CSV reports.

I believe you ran into this issue. It is at least partly fixed in Kibana 7.16.1; see the Reporting section of the release notes (Kibana 7.16.1 | Kibana Guide [7.16] | Elastic).

In my opinion, CSV exports are still slow compared to pre-7.15; a 100 MB report takes ~20 min. Another issue is that if you are using SAML authentication, the token will expire during a long export because of Reporting should work with token based authentication · Issue #71866 · elastic/kibana · GitHub. This can be mitigated by increasing xpack.security.authc.token.timeout to 1h from the 20m default, but it does not fully solve the issue. I am not sure whether local system accounts are affected by authentication token expiry.
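For reference (and assuming the standard token service is in use), that timeout is an Elasticsearch security setting, so it goes in elasticsearch.yml rather than kibana.yml:

elasticsearch.yml
xpack.security.authc.token.timeout: 1h

If I remember correctly, 1h is also the maximum value this setting accepts, which would explain why raising it only mitigates the problem for longer exports.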
