Hi guys, I've been facing a problem with CSV exports in Kibana for a week now, and I'd be happy to get some help here.
I want to generate big CSV reports from Kibana that could reach 3 GB.
What surprised me is that every time I try to export a file larger than 300 MB from Kibana, Elasticsearch gets killed automatically.
I run Kibana as follows:
NODE_OPTIONS="--max-old-space-size=4096" ./bin/kibana
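In case it matters, here's a quick sanity check that the heap flag is actually applied (my assumption: the Node binary honours NODE_OPTIONS the same way as the one bundled with Kibana):

NODE_OPTIONS="--max-old-space-size=4096" node -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"

It prints a value slightly above 4096 when the flag is in effect.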
And I run Elasticsearch as follows:
bin/elasticsearch
My configuration in kibana.yml is:
xpack.security.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
elasticsearch.requestTimeout: 900000
xpack.reporting.queue.timeout: 120000000
xpack.reporting.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
xpack.reporting.csv.maxSizeBytes: 400000000
xpack.encryptedSavedObjects.encryptionKey: "EePXCifCaykPORwjqKGbINHCPFYnNXZT"
csp.strict: true
xpack.reporting.enabled: true
logging.verbose: true
server.maxPayloadBytes: 10485760
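As a sanity check on the units (they're easy to misread): xpack.reporting.csv.maxSizeBytes is plain bytes, so 400000000 comes out to roughly 381 MiB, which should still fit under the 500mb http.max_content_length I set on the Elasticsearch side:

echo $((400000000 / 1024 / 1024))   # prints 381 (MiB)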
and my configuration in elasticsearch.yml is:
indices.breaker.total.use_real_memory: false
http.max_content_length: 500mb
and my jvm.options config is as follows:
-Xms4g
-Xmx4g
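To confirm the 4 GB heap is actually applied to the running node, I can query the cat nodes API (assuming a default setup on localhost:9200; add -u user:pass if security is enabled):

curl -s "localhost:9200/_cat/nodes?v&h=name,heap.max"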
So when I try to download a report from Kibana that, based on my config, should be capped at 400 MB, Kibana gets killed automatically:
log [16:58:23.585] [warning][csv][execute-job][khz2to7r0wtld0ca1c33bvgg][reporting] max Size Reached
log [16:58:23.586] [debug][csv][execute-job][khz2to7r0wtld0ca1c33bvgg][reporting] executing clearScroll request
log [16:58:23.588] [debug][csv][execute-job][khz2to7r0wtld0ca1c33bvgg][reporting] finished generating, total size in bytes: 1048575667
log [16:58:31.248] [info][esqueue][queue-worker][reporting] khz2q2s00wtld0ca1c77dr67 - Job execution completed successfully
Killed
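I suspect that bare Killed line means the kernel OOM killer stopped the process rather than Kibana exiting on its own; if it's useful, the kernel log can be checked for that like this (assuming a Linux host):

dmesg -T | grep -iE "out of memory|killed process"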
So I want to ask: am I doing something wrong?
I'd appreciate any help from you guys!