I am using a data engine to send web-traffic data to my ELK stack (version 5.2). When I try to visualize the data in my Kibana dashboard, I sometimes get the following error:
```
Error Visualize: [request] Data too large, data for [agg [1]] would be larger than limit of [311387750/296.9mb]

Error: Request to Elasticsearch failed: {"error":{"root_cause":[{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [1]>] would be larger than limit of [311387750/296.9mb]","bytes_wanted":311392312,"bytes_limit":311387750}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"logstash-2017.06.22","node":"iRl6kRKrTSGWqxFYI9i0rQ","reason":{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [1]>] would be larger than limit of [311387750/296.9mb]","bytes_wanted":311392312,"bytes_limit":311387750}}],"caused_by":{"type":"circuit_breaking_exception","reason":"[request] Data too large, data for [<agg [1]>] would be larger than limit of [311387750/296.9mb]","bytes_wanted":311392312,"bytes_limit":311387750}},"status":503}
    at http://192.168.1.91:5601/bundles/kibana.bundle.js?v=14723:27:18931
    at Function.Promise.try (http://192.168.1.91:5601/bundles/commons.bundle.js?v=14723:75:22354)
    at http://192.168.1.91:5601/bundles/commons.bundle.js?v=14723:75:21724
    at Array.map (native)
    at Function.Promise.map (http://192.168.1.91:5601/bundles/commons.bundle.js?v=14723:75:21679)
    at callResponseHandlers (http://192.168.1.91:5601/bundles/kibana.bundle.js?v=14723:27:18543)
    at http://192.168.1.91:5601/bundles/kibana.bundle.js?v=14723:27:7044
    at processQueue (http://192.168.1.91:5601/bundles/commons.bundle.js?v=14723:38:23621)
    at http://192.168.1.91:5601/bundles/commons.bundle.js?v=14723:38:23888
    at Scope.$eval (http://192.168.1.91:5601/bundles/commons.bundle.js?v=14723:39:4619)
```
This error also happens with [agg [6]] and [agg [9]].
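From what I understand, the number in the message is the [request] circuit breaker limit, which in Elasticsearch 5.x defaults to 60% of the JVM heap (`indices.breaker.request.limit`), so a ~297mb limit suggests a fairly small heap. I can check the configured limit and the trip counts with the node stats API (assuming Elasticsearch itself is listening on 192.168.1.91:9200, next to Kibana on 5601):

```sh
# Per-node circuit breaker stats; the "request" section shows
# limit_size, estimated_size, and how often the breaker has tripped.
# The host/port is an assumption about my setup -- adjust as needed.
curl -s 'http://192.168.1.91:9200/_nodes/stats/breaker?pretty'
```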
I suspect the error is caused by a data table I am using in my dashboard. Here is the Elasticsearch query the data table sends:
```json
{
  "query": {
    "bool": {
      "must": [
        {
          "query_string": {
            "analyze_wildcard": true,
            "query": "*"
          }
        },
        {
          "query_string": {
            "analyze_wildcard": true,
            "query": "*"
          }
        },
        {
          "range": {
            "timestamp": {
              "gte": 1498154367415,
              "lte": 1498155267415,
              "format": "epoch_millis"
            }
          }
        }
      ],
      "must_not": []
    }
  },
  "size": 0,
  "_source": {
    "excludes": []
  },
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "timestamp",
        "interval": "30s",
        "time_zone": "America/Los_Angeles",
        "min_doc_count": 1
      },
      "aggs": {
        "3": {
          "terms": {
            "field": "appid_name.keyword",
            "size": 5,
            "order": {
              "1": "desc"
            }
          },
          "aggs": {
            "1": {
              "sum": {
                "script": {
                  "inline": "doc['fwdbytes'].value+doc['bwdbytes'].value",
                  "lang": "painless"
                }
              }
            },
            "5": {
              "terms": {
                "field": "srcip",
                "size": 5,
                "order": {
                  "1": "desc"
                }
              },
              "aggs": {
                "1": {
                  "sum": {
                    "script": {
                      "inline": "doc['fwdbytes'].value+doc['bwdbytes'].value",
                      "lang": "painless"
                    }
                  }
                },
                "6": {
                  "terms": {
                    "field": "dstip",
                    "size": 5,
                    "order": {
                      "1": "desc"
                    }
                  },
                  "aggs": {
                    "1": {
                      "sum": {
                        "script": {
                          "inline": "doc['fwdbytes'].value+doc['bwdbytes'].value",
                          "lang": "painless"
                        }
                      }
                    },
                    "7": {
                      "terms": {
                        "field": "srcport",
                        "size": 5,
                        "order": {
                          "1": "desc"
                        }
                      },
                      "aggs": {
                        "1": {
                          "sum": {
                            "script": {
                              "inline": "doc['fwdbytes'].value+doc['bwdbytes'].value",
                              "lang": "painless"
                            }
                          }
                        },
                        "8": {
                          "terms": {
                            "field": "dstport",
                            "size": 5,
                            "order": {
                              "1": "desc"
                            }
                          },
                          "aggs": {
                            "1": {
                              "sum": {
                                "script": {
                                  "inline": "doc['fwdbytes'].value+doc['bwdbytes'].value",
                                  "lang": "painless"
                                }
                              }
                            },
                            "9": {
                              "terms": {
                                "field": "proto",
                                "size": 5,
                                "order": {
                                  "1": "desc"
                                }
                              },
                              "aggs": {
                                "1": {
                                  "sum": {
                                    "script": {
                                      "inline": "doc['fwdbytes'].value+doc['bwdbytes'].value",
                                      "lang": "painless"
                                    }
                                  }
                                }
                              }
                            }
                          }
                        }
                      }
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```
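If I am reading this right, the number of buckets explodes combinatorially: the timestamp range spans 15 minutes, so the 30s date histogram produces up to 30 buckets, and each of the six nested terms aggregations (3, 5, 6, 7, 8, 9) multiplies that by up to 5. That is as many as 30 × 5^6 = 468,750 leaf buckets, each carrying its own scripted sum, plus all the intermediate levels, and each shard has to build these structures in memory before the reduce phase, which is presumably what trips the breaker.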
I also sometimes get this warning: "Courier Fetch: 4 of 5 shards failed." (sometimes 2 or 3 shards instead). Are the two problems related?
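One stopgap I have considered (but not yet applied) is temporarily raising the request breaker limit with a transient cluster setting, along these lines (again assuming Elasticsearch on 192.168.1.91:9200; 70% is just an illustrative value):

```sh
# Raise the request circuit breaker above the 5.x default of 60% of heap.
# This is a sketch of a temporary workaround, not a fix for the query itself.
curl -XPUT 'http://192.168.1.91:9200/_cluster/settings' -H 'Content-Type: application/json' -d '
{
  "transient": {
    "indices.breaker.request.limit": "70%"
  }
}'
```

But that feels like it only postpones the problem if the nesting itself is at fault. Is there a better way to fix this, for example by restructuring the data table?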