Data too large

Hi,

I have an Elasticsearch 5.1.2 cluster running on Windows Server 2012 R2.

One of the servers in the cluster is having an issue that I don't know how to solve (I'm new to Elasticsearch).

In the log file, it looks like a request is too large:

path:"Request Path" params: {index=Index name, type=ssoelasticlogger}
org.elasticsearch.common.breaker.CircuitBreakingException: [parent] Data too large, data for [<http_request>] would be larger than limit of [23960249958/22.3gb]
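
In case it's useful, I think the current breaker usage can be read from the node stats API; it returns the configured limit and estimated size for each circuit breaker, including the parent breaker from the error above. This is just a sketch, assuming the node listens on the default localhost:9200 with no security layer in front:

curl -XGET 'http://localhost:9200/_nodes/stats/breaker?pretty'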

The JVM heap is set to 32GB.

The server has 128GB of memory.

Thank you for your help.

That is a very, very old version. I would recommend that you upgrade as a lot of improvements have been made in the last few years.

It is generally important to set the heap to just below 32GB so that compressed object pointers can be used. You should be able to see whether they are enabled in the logs at startup, at least in more recent versions.
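
As a sketch, something like this in config/jvm.options keeps the heap under the compressed-oops cutoff (30g is only an illustrative value, not a recommendation for your workload):

# config/jvm.options
-Xms30g
-Xmx30g

After a restart, the startup log should then contain a line along the lines of "heap size [...], compressed ordinary object pointers [true]".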

To get a better understanding of the cluster, can you provide the full output of the cluster stats API?
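
Something like this should return it, assuming the default HTTP port and no authentication in front of the cluster; please paste the full JSON response:

curl -XGET 'http://localhost:9200/_cluster/stats?human&pretty'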
