CSV Export fails after size exceeds threshold


#1

Hi all,

We're running an Elasticsearch/Kibana installation, v6.2.3, with the X-Pack basic license activated. CSV exports succeed up to a certain file size (~800 KB). If the number of columns or rows in the search being exported is increased, the export fails after 3 attempts. We suspected this was due to timeouts, so we increased the timeout and file size settings in kibana.yml, but the issue persists. Current settings below:

xpack.reporting.queue.pollInterval: 60000    # milliseconds (60 s)
xpack.reporting.queue.timeout: 240000        # milliseconds (240 s)
xpack.reporting.csv.maxSizeBytes: 52428800   # bytes (50 MB)

Additional plugins running:

  • own_home
  • searchguard

The successful exports are ready in just a few seconds. We cannot find anything peculiar in the Kibana logs; searching them for "csv" only shows the requests hitting the reporting API.

Any ideas are more than welcome.


#2

After some further investigation, we found the following:

  • the issue was related to own_home, which uses hapi.js to run a proxy server between Kibana and Elasticsearch
  • hapi.js by default limits request payloads to 1 MB, so any export larger than that was doomed to fail
  • the solution was to change init_proxy.js in own_home to allow larger payloads (see the sketch below).
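
For reference, the relevant hapi option is payload.maxBytes, which defaults to 1048576 bytes (1 MB) and was therefore rejecting the larger CSV report bodies. The contents of init_proxy.js differ between own_home versions, so the following is only a minimal sketch of that option, written against the callback-style hapi API that Kibana 6.x-era plugins run on; the port, path, and handler are placeholders, not own_home's actual code:

// Minimal sketch only -- not the actual own_home init_proxy.js.
// It illustrates hapi's payload.maxBytes option, whose 1 MB default
// (1048576 bytes) was rejecting the larger CSV report payloads.
const Hapi = require('hapi');

const server = new Hapi.Server();
server.connection({ port: 18080 });   // placeholder proxy port

server.route({
  method: 'POST',
  path: '/{path*}',                   // catch-all, stands in for the proxied Kibana paths
  config: {
    payload: {
      // Raise the 1 MB default; 52428800 matches the 50 MB used for
      // xpack.reporting.csv.maxSizeBytes in the settings above.
      maxBytes: 52428800,
    },
  },
  handler: (request, reply) => {
    // own_home forwards the request to Kibana/Elasticsearch here;
    // a plain acknowledgement stands in for that proxying logic.
    reply({ ok: true });
  },
});

server.start((err) => {
  if (err) throw err;
});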

This thread can be considered closed.


(kulkarni) #3

Great, thanks for posting the conclusion you arrived at.

Cheers
Rashmi


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.