Is there a serious concern if I set `max_expansions` to a very large value, say 5000?
I have ~100 million documents and I search on phone number prefixes (stored as sequences of digits). As the data has grown, search accuracy has dropped, and now I sometimes have to enter 6 or 7 digits before I get matches.
As a quick fix, I would like to increase `max_expansions` to 5000 in my search query and take the performance hit, before moving to an n-gram tokenizer.
Are there any serious concerns with such a large `max_expansions` value (going from the default of 50 up to 5000)?
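For context, the query looks roughly like this (I'm assuming a `match_phrase_prefix` query here; the `contacts` index and `phone` field names are just placeholders for my real ones):

```json
GET /contacts/_search
{
  "query": {
    "match_phrase_prefix": {
      "phone": {
        "query": "415555",
        "max_expansions": 5000
      }
    }
  }
}
```

My understanding is that `max_expansions` caps how many terms the prefix is expanded into per shard, so raising it should only trade query latency for recall, but I'd like to confirm there's no other downside.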