In Kibana, when I try to search or filter on an index that contains documents with a very large field (it stores large text files of 5,000 lines or more), I get this error, but I don't see any error in the Kibana or Elasticsearch log files.
The problem might be another timeout, caused either by sending a lot of data over the network or by parsing a large amount of data.
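If it does turn out to be a request timeout, you can raise Kibana's timeout toward Elasticsearch in kibana.yml. A minimal sketch (the default is 30000 ms; pick a value that fits your environment):

```yaml
# kibana.yml
# Time in milliseconds Kibana waits for responses from Elasticsearch.
# Default is 30000; 120000 below is just an illustrative value.
elasticsearch.requestTimeout: 120000
```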
You can use Source Filters under Management > Index Patterns to prevent that field from being included in queries as you interact with Kibana. Just pick the index pattern (or patterns) you're using with that field and add the field to the list. I believe it's applied as a `_source` excludes in the query, but I'm not totally sure. The important part is that the field's contents won't be sent over the network or parsed in the request anymore.
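For reference, a source filter roughly corresponds to a search request like the following (the index name `my-index` and field name `large_text` are placeholders for illustration):

```
# Sketch: exclude a large field from the returned _source.
curl -XGET 'localhost:9200/my-index/_search?pretty' -H 'Content-Type: application/json' -d'
{
  "_source": { "excludes": ["large_text"] },
  "query": { "match_all": {} }
}'
```

The field is still indexed and searchable; it's just dropped from the documents returned in the response.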
But that field is the one I want to explore in Kibana. Is there any way to increase this Kibana timeout to handle large documents, or to have Kibana not load all the data from the documents at once and only load a document's contents when I click on it?
One way to load less data under Discover would be to lower the value of discover:sampleSize under Management > Advanced Settings. This is, however, a global setting.
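If you'd rather script it than click through the UI, a sketch using Kibana's advanced-settings HTTP API (I'm assuming the endpoint from recent Kibana versions; the kbn-xsrf header is required, and host/auth will differ in your setup):

```
# Sketch: lower Discover's sample size from the default 500 to 100.
curl -XPOST 'localhost:5601/api/kibana/settings' \
  -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
  -d '{"changes": {"discover:sampleSize": 100}}'
```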