Is supporting a very high number of query terms a good idea?

We have a Postgres table with nearly a million rows that we index into Elasticsearch. We want to support a specific use case: a user copies and pastes nearly 10k unique identifiers to filter for specific entries for further processing. However, the Elastic documentation recommends against raising `maxClauseCount` unless absolutely necessary, due to heavy resource utilization. Has anyone raised it anyway, or found other workarounds? My initial thought is not to support this use case through Elasticsearch, but through other means, along the lines of a batch-processing implementation.
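For reference, the batch-processing idea I had in mind would look roughly like this: split the pasted identifier list into smaller chunks, run one `terms` query per chunk, and merge the hits client-side, so no single request needs thousands of clauses. A minimal sketch (the field name `entry_id` and chunk size are just assumptions for illustration):

```python
def chunked_terms_queries(field, ids, max_terms=1024):
    """Split a large identifier list into several smaller terms queries.

    Each returned dict is a complete Elasticsearch query body that can be
    sent as its own search request; the hits from all requests are then
    merged client-side. `field` (e.g. a hypothetical "entry_id" keyword
    field) and `max_terms` are placeholders to tune for your mapping.
    """
    queries = []
    for start in range(0, len(ids), max_terms):
        chunk = ids[start:start + max_terms]
        queries.append({"query": {"terms": {field: chunk}}})
    return queries


# ~10k identifiers become a handful of modest requests instead of one huge one.
ids = [str(i) for i in range(10_000)]
bodies = chunked_terms_queries("entry_id", ids)
```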

Welcome to our community! :smiley:

Based on the topics I have seen here, that recommendation is realistic: massive terms lists do cause performance issues.
If you have an alternative method, it is worth investigating.
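One alternative worth looking at is the terms-lookup form of the `terms` query: you first index the identifier list as a document, then reference it in the query instead of sending all the terms inline. A rough sketch (index, document id, and field names here are hypothetical):

```json
{
  "query": {
    "terms": {
      "entry_id": {
        "index": "user_id_lists",
        "id": "request-123",
        "path": "ids"
      }
    }
  }
}
```

This keeps the search request small, though you still need a step that writes each pasted list into the lookup index before querying, and the usual terms limits still apply to the looked-up values.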

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.