Hello,
I have already increased the Java heap sizes for graylog-server (4.3.13) and for Elasticsearch. According to the nodes overview, my Graylog heap setting of 8/12 GB is nowhere near full. Elasticsearch can run searches (sometimes) and usually has at least 20 GB available on each of the 4 nodes. Nevertheless, my log files get flooded with the errors below, and I don't know where I could raise the 4.1 GB limit (see the sketch after the error log for what I have checked so far).
Thanks for any hints!
[parent] Data too large, data for [<http_request>] would be [4449250311/4.1gb], which is larger than the limit of [4448655769/4.1gb], usages [request=0/0b, fielddata=1012822694/965.9mb, in_flight_requests=594083/580.1kb, accounting=3435833534/3.1gb], errorDetails=[[parent] Data too large, data for [<http_request>] would be [4449250311/4.1gb], which is larger than the limit of [4448655769/4.1gb], usages [request=0/0b, fielddata=1012822694/965.9mb, in_flight_requests=594083/580.1kb, accounting=3435833534/3.1gb]]}
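In case it helps, here is a minimal sketch of how I am checking the actual JVM heap and parent circuit-breaker limit on each Elasticsearch node, and where the limit percentage could be raised. It assumes plain HTTP access to the cluster on localhost:9200, the standard nodes-stats and cluster-settings APIs, and the Python requests library; the node address and the 80% value are just placeholders.

import requests

# Assumption: adjust to one of your own Elasticsearch nodes; no auth/TLS here.
ES_URL = "http://localhost:9200"

# The "[parent] Data too large ... 4.1gb" limit comes from
# indices.breaker.total.limit, which is a percentage of the JVM heap of the
# Elasticsearch node that rejects the request, not of the Graylog server heap.
stats = requests.get(f"{ES_URL}/_nodes/stats/jvm,breaker").json()
for node_id, node in stats["nodes"].items():
    heap_max = node["jvm"]["mem"]["heap_max_in_bytes"]
    parent_limit = node["breakers"]["parent"]["limit_size_in_bytes"]
    print(f"{node['name']}: heap_max={heap_max / 2**30:.1f} GiB, "
          f"parent breaker limit={parent_limit / 2**30:.1f} GiB")

# If the heap really is larger but the breaker still trips at ~4.1 GB, the
# threshold can be moved with the dynamic cluster setting below. This only
# raises the percentage; the usual fix is giving that node more heap.
requests.put(
    f"{ES_URL}/_cluster/settings",
    json={"persistent": {"indices.breaker.total.limit": "80%"}},
)

If the printed heap_max is still around 4-5 GiB despite the increased setting, that would suggest the new heap size was never picked up by that node (jvm.options / ES_JAVA_OPTS), which would explain the 4.1 GB ceiling.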