Kibana Discover: Request Timeout after 30000ms when searching large documents

In Kibana, when I try to search or filter on an index containing documents with a very large field (it stores large text files of 5,000 lines or more), I get the error below, but I don't see any errors in the Kibana or Elasticsearch log files:

Discover: Request Timeout after 30000ms

Error: Request Timeout after 30000ms
at http://192.xxxx:5601/bundles/kibana.bundle.js?v=15382:13:4431

The heap size was already increased to 12 GB.

Is there any way to improve this so I can manage and search indices that contain large text documents?

I tried changing elasticsearch.requestTimeout: 900000 and restarting, but it doesn't work; the error still says 30000ms.
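For reference, that setting lives in kibana.yml and takes effect only after restarting the Kibana process that actually reads that file. A minimal sketch (the path comment is an assumption; it varies by install method):

```yaml
# kibana.yml (often /etc/kibana/kibana.yml on package installs)
# Timeout in milliseconds for requests Kibana sends to Elasticsearch.
# The default is 30000, which matches the error message above.
elasticsearch.requestTimeout: 900000
```

If the error still reports 30000ms after a restart, that suggests the running Kibana is not picking up this file (wrong path, or a different config passed with -c).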

Also, if I run the same query in a Visualize table that doesn't show the large field, it works.

The problem might be a different timeout, caused either by sending a lot of data over the network or by parsing a large amount of data on the client.

You can use Source Filters under Management > Index Patterns to prevent that field from being included in queries as you interact with Kibana. Just pick the index pattern (or patterns) you're using with that field and add the field to the list. I believe it just adds it as an excludes in the query's source filtering, but I'm not totally sure. The important part is that you won't be sending or parsing that field in the request anymore.
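You can check what that exclusion does by querying Elasticsearch directly with _source filtering. A sketch of the request body; "logs-*" and "big_text" are placeholder names, not from this thread:

```json
POST /logs-*/_search
{
  "query":   { "match_all": {} },
  "_source": { "excludes": [ "big_text" ] }
}
```

Hits come back with the large field stripped from _source, so the response is small even when the stored documents are huge.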

But that field is the one I want to explore in Kibana. Is there any way to increase this Kibana timeout to handle large documents, or to have Kibana not load all the data from every document at once and only fetch the full document when clicking on one?

One way to load less data in Discover is to lower the value of discover:sampleSize under Management > Advanced Settings, which controls how many documents Discover fetches per search. This is, however, a global setting.
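If you prefer to script it, recent Kibana versions expose an advanced-settings API; a hedged sketch (the endpoint and payload shape are assumptions and may not exist in older 5.x releases, where the UI is the only option):

```shell
# Sketch: lower Discover's sample size to 100 documents per search.
# The kbn-xsrf header is required by Kibana's API; host/port are placeholders.
curl -s -X POST 'http://localhost:5601/api/kibana/settings' \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  -d '{ "changes": { "discover:sampleSize": 100 } }'
```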

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.