CSV export in Discover of very long fields is failing

Hi,

We are using Kibana to consolidate log files from multiple application nodes and merge them into a single stream. In this use case we don't need any aggregation; we want to search for error messages and export the error details such as request body, response body, stack trace, etc., which are fields of the event. Unfortunately these fields can become very long.

I noticed that the CSV export in Discover fails if the fields are too long. If I select smaller fields, the export works.

Next week I can reproduce the error and check whether there are more specific error messages in the logs, but maybe you have a hint already; perhaps I just need to tweak some parameters in Kibana or Elasticsearch.

One additional question: is there a way to export from the Discover module without saving the query first?

The Stack version is 6.5.4.

Thanks, Andreas

Hi Andreas,

When Kibana generates the CSV, it stores it in Elasticsearch for later download. Elasticsearch has a maximum number of bytes it will accept in a single request, and I think your failures might be happening because the generated CSV is larger than that limit.

See the http.max_content_length setting: https://www.elastic.co/guide/en/elasticsearch/reference/6.5/modules-http.html. The default is 100mb.
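
If the generated CSV really is larger than that limit, you can raise http.max_content_length in elasticsearch.yml (it is a static setting, so it needs a node restart). As a rough way to check what limit your cluster is currently running with, here is a minimal sketch using Python and the requests library against the cluster settings API; the host, port, and lack of authentication are placeholder assumptions you would need to adapt:

```python
import requests

# Placeholder endpoint; adjust host, port, and authentication for your cluster.
ES_URL = "http://localhost:9200"

# Ask Elasticsearch for the effective http.max_content_length (100mb by default).
# include_defaults=true is needed because the setting is usually not set explicitly.
resp = requests.get(
    f"{ES_URL}/_cluster/settings",
    params={
        "include_defaults": "true",
        "filter_path": "**.http.max_content_length",
    },
)
resp.raise_for_status()
print(resp.json())

# To raise the limit, set e.g. `http.max_content_length: 200mb` in elasticsearch.yml
# on each node and restart; it cannot be changed through the settings API.
```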

Newer versions of Kibana include a lot of improvements to how errors like this are logged.

That is hopefully coming very soon! Work is in progress and expected to land in 7.x.
