Max docs per index limit

Hi folks,

Is it possible to set a limit on the maximum number of docs in an index?
And maybe to evict old docs when the limit is reached?

I want to use elasticsearch with an in-memory index for caching.
My cache docs get a TTL, but I want to limit the maximum number of docs
in the index too, because I don't know the real number of docs that will
need to be cached.
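
To make it concrete, here is roughly the setup I mean, sketched in
Python with requests against the 0.90-era REST API. As far as I know
the memory store and _ttl mapping are real settings; the index and
type names are just examples. A max-docs setting is the missing piece:

import requests

# Create an in-memory index whose docs expire via _ttl. (Hypothetical
# index/type names; relies on the periodic TTL purger to delete
# expired docs.)
requests.put("http://localhost:9200/cache", json={
    "settings": {"index.store.type": "memory"},  # hold the index in RAM
    "mappings": {
        "entry": {
            "_ttl": {"enabled": True, "default": "10m"}  # expire docs
        }
    },
})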

Cheers and thanks
Vadim

I believe Lucene imposes a limit of about 2 billion documents per
index, but that was with Lucene 3. Perhaps Lucene 4 allows more.

AFAIK, it is not possible to set an arbitrary limit in either elasticsearch
or Lucene. Imposing a limit would be difficult since the total document
count fluctuates greatly due to deleted documents and segment merging.
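
If an approximate cap is enough, you could trim the index yourself from
the client side. A rough sketch in Python with requests (the index
name, limit, and created_at timestamp field are hypothetical; _count,
_search, and delete-by-query are standard REST endpoints):

import requests

ES = "http://localhost:9200"
INDEX = "cache"      # hypothetical index name
MAX_DOCS = 100000    # hypothetical limit

def trim():
    # How far over the limit are we?
    count = requests.get("%s/%s/_count" % (ES, INDEX)).json()["count"]
    excess = count - MAX_DOCS
    if excess <= 0:
        return
    # Find the timestamp of the oldest document we would keep.
    hits = requests.get("%s/%s/_search" % (ES, INDEX), json={
        "size": 1, "from": excess,
        "sort": [{"created_at": "asc"}],  # hypothetical timestamp field
    }).json()["hits"]["hits"]
    cutoff = hits[0]["sort"][0]
    # Delete everything strictly older than that. Only approximate:
    # ties at the cutoff survive, and the count keeps moving underneath.
    requests.delete("%s/%s/_query" % (ES, INDEX),
                    json={"range": {"created_at": {"lt": cutoff}}})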

Cheers,

Ivan

Hi Ivan,
thanks for your answer.
That arbitrary limit was my point. I was searching for something like
capped collections in MongoDB, which evict the oldest documents
automatically once a size or document limit is reached (see the sketch
below).

So maybe I have to choose something different for my "cache".
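
For comparison, this is what a capped collection gives you with one
call via pymongo (database/collection names and limits are just
examples); the oldest documents are dropped automatically as new ones
come in:

from pymongo import MongoClient

db = MongoClient()["cachedb"]       # example database name
cache = db.create_collection(
    "cache",                        # example collection name
    capped=True,
    size=64 * 1024 * 1024,          # max size in bytes (required)
    max=100000,                     # max number of documents
)
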
Cheers
Vadim

An accounting plugin could be fairly easy to implement for limiting
resources moved over the API, such as the number of shards or the
volume of source; even docs-per-second or queries-per-second rate
limits per index should be possible (traffic shaping). The idea is to
store extra metadata from the configuration in the cluster state and
have each node check the index-based limits.
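
Conceptually, the check each node would run looks like this (this is
not a real plugin API, just a sketch of the idea in Python):

# Per-index limits live as metadata in the cluster state; each node
# consults them before applying an operation to a local shard.
def allow_operation(cluster_state, index, stats):
    limits = cluster_state["metadata"]["indices"].get(index, {})
    if stats["doc_count"] >= limits.get("max_docs", float("inf")):
        return False  # over the document limit
    if stats["docs_per_second"] > limits.get("max_dps", float("inf")):
        return False  # indexing rate limit hit (traffic shaping)
    return True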

I'm not a great fan of it, but I know that commercial offerings may
like to be based on rate limiting of all kinds.

Jörg
