I have a use case where I am receiving a constant stream of documents (on
the scale of a thousand documents every couple of seconds), and I have a
feature where certain users are interested in only a subset of these
documents and want to be notified whenever new documents of that kind are added.
As it stands now, I have a bulk request being submitted every couple of seconds
using
https://github.com/elasticsearch/elasticsearch/blob/master/src/main/java/org/elasticsearch/action/bulk/BulkProcessor.java
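Roughly, the submission side looks like the sketch below (just a simplified sketch: the index/type names and the action threshold are placeholders, client and json are assumed to be in scope, and the builder options may differ a little between versions):

import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.index.IndexRequest;

// Flushes a bulk request once ~1000 documents have been added, which at my
// ingest rate works out to a flush every couple of seconds.
BulkProcessor bulkProcessor = BulkProcessor.builder(client, new BulkProcessor.Listener() {
    public void beforeBulk(long executionId, BulkRequest request) {
        // nothing to do before the bulk goes out
    }

    public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
        // per-item responses are available here
    }

    public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
        // log the failure
    }
})
        .setBulkActions(1000)
        .setConcurrentRequests(1)
        .build();

// Every incoming document is simply handed to the processor.
bulkProcessor.add(new IndexRequest("docs", "doc").source(json));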
I was wondering whether it is possible to percolate in batch as well, i.e.
after submitting my bulk request, can the response include the matching
queries along with the IDs of the documents that matched them (or a field of
my choice)?
I was also wondering about the scale and validity of such a solution, i.e.
how many percolator queries can be registered, and how does growth in that
number affect performance in general?
Say I have 1,000 users, and each will have 2-3 queries that I need to
percolate against, so on the order of 2,000-3,000 registered queries. Is this
workable, or is it too much? And if this is not the way to go, what would be
the best approach, given that ES is the only place this data lives
permanently now?
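For reference, I am assuming each of those per-user queries would be registered by indexing it into the _percolator index, roughly like this (index, id, and field names are just placeholders):

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;

// Register one of a user's queries. The type of the _percolator document is
// the index the query should be percolated against, and the id is the name
// that comes back when a document matches.
client.prepareIndex("_percolator", "docs", "user-42-query-1")
        .setSource(jsonBuilder()
                .startObject()
                    .field("query", termQuery("topic", "elasticsearch")) // the user's interest
                .endObject())
        .setRefresh(true) // make the query visible to percolation right away
        .execute().actionGet();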
Bulk requests support percolation. You just need to set the percolate field on
your IndexRequest. As far as performance is concerned, I would suggest
testing it.
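Roughly something like this (a sketch only: documents and notifyUser are placeholders, and the exact getter names may vary a bit between versions):

import java.util.List;

import org.elasticsearch.action.bulk.BulkItemResponse;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.index.IndexResponse;

// Ask for every document in the bulk to be percolated against all registered
// queries ("*" can be replaced with a query string to restrict which ones run).
BulkRequestBuilder bulk = client.prepareBulk();
for (String json : documents) {
    bulk.add(new IndexRequest("docs", "doc")
            .source(json)
            .percolate("*"));
}

BulkResponse bulkResponse = bulk.execute().actionGet();
for (BulkItemResponse item : bulkResponse) {
    if (item.isFailed()) {
        continue; // handle or log the failed item
    }
    IndexResponse indexResponse = (IndexResponse) item.getResponse();
    // The matches are the ids of the percolator queries this document matched.
    List<String> matches = indexResponse.getMatches();
    if (matches == null) {
        continue;
    }
    for (String queryName : matches) {
        notifyUser(queryName, indexResponse.getId()); // hypothetical notification hook
    }
}

The same idea works with BulkProcessor: set percolate on each IndexRequest you add, and read the matches from the BulkResponse in the listener's afterBulk callback.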
On Monday, April 8, 2013 6:25:38 AM UTC-4, Mo wrote:
Perfect! Many thanks.
On Monday, April 8, 2013 10:04:09 PM UTC+4, Igor Motov wrote: