Kibana gets killed automatically after CSV export

Hi guys, I've been struggling with this problem for two weeks now. Every time I download a 1 GB CSV file, Kibana gets killed, and I don't know why.
This is the error I get:

  log   [12:18:40.381] [debug][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] executing scroll request
  log   [12:18:40.577] [debug][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] executing scroll request
  log   [12:18:40.765] [debug][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] executing scroll request
  log   [12:18:40.961] [debug][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] executing scroll request
  log   [12:18:41.082] [warning][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] max Size Reached
  log   [12:18:41.094] [debug][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] executing clearScroll request
  log   [12:18:41.105] [debug][csv][execute-job][ki089pr10x8ld0ca1c2hupiq][reporting] finished generating, total size in bytes: 1048575667
  log   [12:18:59.181] [info][esqueue][queue-worker][reporting] ki087cv90x8ld0ca1c2da3ba - Job execution completed successfully
Killed

This is my kibana.yml:

xpack.security.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
elasticsearch.requestTimeout: 900000
xpack.reporting.queue.timeout: 120000000
xpack.reporting.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
xpack.reporting.csv.maxSizeBytes: 1000000000 
xpack.encryptedSavedObjects.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
csp.strict: true
xpack.reporting.enabled: true
logging.verbose: true
server.maxPayloadBytes: 10485760  

I would appreciate any help.


You also need to increase server.maxPayloadBytes: your config caps it at 10485760 (10 MB), far below the 1 GB you allow in xpack.reporting.csv.maxSizeBytes. The bare Killed at the end of your log usually means the operating system's OOM killer terminated the Kibana process, since the whole CSV is held in memory while it is generated. But using CSVs this large is really not recommended.
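If you really have to keep exports near that size, a minimal sketch of the relevant kibana.yml settings could look like this (the values are illustrative, not recommendations):

# Keep the request payload limit at least as large as the biggest report
server.maxPayloadBytes: 1048576000
# Cap on the generated CSV; the scroll stops once this is reached
xpack.reporting.csv.maxSizeBytes: 1000000000
# Give the report job enough time to finish
xpack.reporting.queue.timeout: 120000000

You would also have to give the Kibana Node.js process enough heap to hold the whole report in memory, for example by starting it with NODE_OPTIONS="--max-old-space-size=4096". Splitting the export into smaller time ranges is usually the better option.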

