We are getting hit by occasional "too_many_clauses" errors, and I believe the cause is our extensive use of synonym expansion, where some words expand into quite a few others.
But I don't quite understand the details of how a query gets expanded beyond 1024 clauses.
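For context, my mental model of the expansion is roughly this: a single input term is rewritten into one clause per synonym, per queried field, so the counts multiply. A hedged sketch (index and field names are hypothetical, not our real mapping):

```json
GET /my-index/_search
{
  "query": {
    "multi_match": {
      "query": "laptop",
      "fields": ["title", "body", "tags"]
    }
  }
}
```

If "laptop" has, say, 20 synonyms and we search 3 fields, that single input term already contributes on the order of 60 clauses, and a multi-word user query multiplies that again.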
For starters, I cannot replicate the error on my local test machine; the difference being that the test setup doesn't have much content in the index. So does the final number of clauses depend on the index's term distribution or something? I.e., are words in the expansion discarded if the term/token doesn't exist in the index before the query is built?
Secondly, this problem didn't start showing up until we moved away from the "synonym" token filter and used the "synonym_graph" token filter instead. Is there a "blowup" in the number of clauses with this move?
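Our analyzer setup looks roughly like the following (names and the synonym rule are simplified placeholders, not our real config). I mention it because the multi-word rule is where I suspect the graph filter behaves differently, since it can emit multi-position paths that become phrase clauses:

```json
PUT /my-index
{
  "settings": {
    "analysis": {
      "filter": {
        "my_synonyms": {
          "type": "synonym_graph",
          "synonyms": ["laptop, notebook, portable computer"]
        }
      },
      "analyzer": {
        "my_search_analyzer": {
          "tokenizer": "standard",
          "filter": ["lowercase", "my_synonyms"]
        }
      }
    }
  }
}
```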
And lastly, is there a way to view the final query with all its clauses? That would be helpful in understanding what is going on.
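The closest I have found so far is the validate API with rewriting enabled, which I understand should print the rewritten Lucene query (index and field names hypothetical):

```json
GET /my-index/_validate/query?explain=true&rewrite=true
{
  "query": {
    "match": { "title": "laptop" }
  }
}
```

I have also seen that adding `"profile": true` to a `_search` request breaks the executed query down into its low-level components, but I am not sure whether either of these shows every synonym clause, or whether they hit the same too_many_clauses limit while rewriting.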