I'm getting an error while using a Kibana dashboard and I can't figure out what is going wrong.
Error: Request to Elasticsearch failed: {"error":{"root_cause":[{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<reused_arrays>] would be [1267582136/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267582136,"bytes_limit":1267571097},{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [1]>] would be [1267574424/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267574424,"bytes_limit":1267571097},{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<reused_arrays>] would be [1267585744/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267585744,"bytes_limit":1267571097},{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [3]>] would be [1267574424/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267574424,"bytes_limit":1267571097}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"va","node":"KuXT-BxhQ7iSdONqNJZPtw","reason":{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<reused_arrays>] would be [1267582136/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267582136,"bytes_limit":1267571097}},{"shard":1,"index":"va","node":"KuXT-BxhQ7iSdONqNJZPtw","reason":{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [1]>] would be [1267574424/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267574424,"bytes_limit":1267571097}},{"shard":2,"index":"va","node":"KuXT-BxhQ7iSdONqNJZPtw","reason":{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<reused_arrays>] would be [1267585744/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267585744,"bytes_limit":1267571097}},{"shard":4,"index":"va","node":"KuXT-BxhQ7iSdONqNJZPtw","reason":{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [3]>] would be [1267574424/1.1gb], which is larger than the limit of [1267571097/1.1gb]","bytes_wanted":1267574424,"bytes_limit":1267571097}}]},"status":503}
at http://x.x.x.x/bundles/kibana.bundle.js?v=15104:28:10760
at Function.Promise.try (http://x.x.x.x/bundles/commons.bundle.js?v=15104:82:22203)
at http://x.x.x.x/bundles/commons.bundle.js?v=15104:82:21573
at Array.map (native)
at Function.Promise.map (http://x.x.x.x/bundles/commons.bundle.js?v=15104:82:21528)
at callResponseHandlers (http://x.x.x.x/bundles/kibana.bundle.js?v=15104:28:10376)
at http://x.x.x.x/bundles/kibana.bundle.js?v=15104:27:29944
at processQueue (http://x.x.x.x/bundles/commons.bundle.js?v=15104:38:23621)
at http://x.x.x.x/bundles/commons.bundle.js?v=15104:38:23888
at Scope.$eval (http://x.x.x.x/bundles/commons.bundle.js?v=15104:39:4619)
The Elasticsearch circuit breaker prevents running queries that would take down Elasticsearch. Selecting less data or increasing your Elasticsearch capacity should fix this.
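For what it's worth, the numbers in the message show the request only just tripping the breaker: the shards wanted between roughly 1267574424 and 1267585744 bytes against the [request] breaker limit of 1267571097 bytes (~1.1gb) on node KuXT-BxhQ7iSdONqNJZPtw, i.e. only a few kilobytes over. That limit defaults to a percentage of the JVM heap, so the node appears to have a fairly small heap for what this dashboard is asking of it. If trimming the dashboard (shorter time range, fewer buckets in the aggregations) is not an option, below is a rough sketch of how you could inspect the breaker and temporarily raise its limit; localhost:9200 and the 70% value are placeholders for illustration, not a recommendation for your cluster:

# Check current circuit-breaker usage and limits on every node
curl -s 'http://localhost:9200/_nodes/stats/breaker?pretty'

# Temporarily raise the request breaker limit (transient settings reset on
# a full cluster restart). 70% here is only an example value.
curl -s -XPUT 'http://localhost:9200/_cluster/settings' -H 'Content-Type: application/json' -d '{
  "transient": {
    "indices.breaker.request.limit": "70%"
  }
}'

Raising the limit only moves the problem closer to a real OutOfMemoryError, so the more durable fix is to give Elasticsearch more heap (via ES_HEAP_SIZE or the -Xms/-Xmx values in jvm.options, depending on your version) or to reduce how much data the dashboard's aggregations pull in per request.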