I occasionally get the following error in Kibana from Elasticsearch:

Oops! ElasticsearchException[org.elasticsearch.common.breaker.CircuitBreakingException: Data too large, data for field [@timestamp] would be larger than limit of [639015321/609.4mb]]

I even get this when searching for less than 5 minutes of data through the Kibana Logstash search. The error disappears and reappears for no reason apparent to me. My total data size is not that big yet. Elasticsearch has been left on the default of splitting the indices by day. The current size is 168M for today.
In order for Kibana to work, Elasticsearch needs to load all @timestamp values into memory (fielddata). This exception is telling you that loading the field into memory would exceed the fielddata circuit breaker's limit and would likely cause an out-of-memory error, so Elasticsearch aborts the request instead. You could start Elasticsearch with more memory and see if that helps.
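A minimal sketch of both options, assuming an Elasticsearch 1.x node started from its install directory; the heap size and breaker percentage below are illustrative values, not recommendations for your cluster:

```shell
# Option 1: start Elasticsearch with a larger JVM heap.
# ES_HEAP_SIZE is read by the 1.x startup script; a common guideline
# is up to ~50% of system RAM.
export ES_HEAP_SIZE=4g
bin/elasticsearch

# Option 2: raise the fielddata circuit-breaker limit at runtime
# (default is a percentage of the heap). The setting shown is the
# 1.3 name; in 1.4+ it was renamed to indices.breaker.fielddata.limit.
curl -XPUT 'http://localhost:9200/_cluster/settings' -d '{
  "persistent": {
    "indices.fielddata.breaker.limit": "85%"
  }
}'
```

Note that raising the breaker limit only moves the threshold; if the node genuinely lacks heap for the fielddata, a larger heap (or fewer/smaller indices queried at once) is the real fix.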
Did you figure this out? I'm running into the same problem.
On Thursday, July 31, 2014 at 3:22:28 AM UTC-6, Rhys Campbell wrote: