In some rare document instances, the size of this array could exceed 50k elements.
Note that when I'm doing the search request, my filter field (in the request body) will only contain 1 element (which would need to be compared to the 50k elements in such a document).
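For context, the request would look roughly like this (a minimal sketch against the App Search search API; the engine name, field name, and filter value are made up for illustration):

```json
POST /api/as/v1/engines/my-engine/search
{
  "query": "chemistry",
  "filters": {
    "assigned_user_ids": ["user-123"]
  }
}
```

So a single filter value would be matched against an `assigned_user_ids` array that, in rare documents, holds up to 50k elements.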
Is this something that Elasticsearch indices and Elastic App Search can handle? I did not find a documented maximum array size.
I don't think you'll want to manually create that type of filter for performance reasons, but it sounds like you may be interested in document level security. This will take care of user access for you. If you're using Connectors to ingest data into App Search, we have a guide to set them up. Hope that helps!
@Kathleen_DeRusso Thank you for your prompt reply and for suggesting document level security.
Unfortunately, I'm in a case where permissions cannot be grouped by role or a similar concept.
You can think of each document as a class, with users individually assigned to those classes. In search, users should only see the classes they are assigned to.
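To make that concrete, a document might look like the following (hypothetical field names; the user ID array is the part that can grow very large):

```json
{
  "id": "class-42",
  "title": "Advanced Chemistry",
  "assigned_user_ids": ["user-001", "user-002", "user-003"]
}
```

Since assignments are per-user rather than per-role, there is no smaller grouping field we could filter on instead.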
I understand that 50k is very large, however, due to the nature of our application, we need to handle these large arrays in some instances, at least for now. So we are trying to assess the impact it may have beforehand.
My understanding is that Elastic App Search can handle such arrays, but that performance will be impacted, perhaps significantly. Is that correct?
Note that only a small fraction of the documents would contain such a large array. In general, the size would be much smaller. Would you say that those few items could affect search performance overall? Do you have a rough idea of the impact it may have?