CSV export in Discover of very long fields is failing


#1

Hi,

We are using Kibana to consolidate log files and merge them into a single stream from multiple application nodes. In this use case we don't need any aggregation, but we want to search for error messages and export the error details, such as the request body, response body, stack trace, etc., which are fields of the event. Unfortunately these fields can become very long.

I noticed that the export in Discover is failing if the fields are too long. If I select smaller fields, the export works.

Next week I can reproduce the error and check whether there are any more specific error messages in the logs, but maybe you have a hint as to whether I just need to tweak some parameters in Kibana or Elasticsearch.

One additional question: is there a way to export from the Discover module without saving the query first?

Stack version is 6.5.4.

Thanks, Andreas


(Tim Sullivan) #2

Hi Andreas,

When Kibana generates the CSV, it stores it in Elasticsearch for later download. Elasticsearch has a max length of bytes it will allow in a data upload, and I think your failures might be happening because the generated CSV data is too large for Elasticsearch.

See the http.max_content_length setting: https://www.elastic.co/guide/en/elasticsearch/reference/6.5/modules-http.html. The default is 100mb.
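If the generated CSV really is hitting that limit, a minimal sketch of raising it would look like the following. This goes in `elasticsearch.yml` on each node; the `200mb` value is just an example for illustration, not a recommendation, and larger uploads mean more heap pressure on the node:

```yaml
# elasticsearch.yml
# Raise the maximum allowed HTTP request body size.
# Default is 100mb; example value only — size it to your largest expected export.
http.max_content_length: 200mb
```

A restart of each Elasticsearch node is needed for the setting to take effect, since it is not a dynamic cluster setting.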

Newer versions of Kibana include a lot of improvements around logging errors like this.


(Tim Sullivan) #3

That is hopefully coming very soon! Work is in progress, and it is expected to land in 7.x.