Why was the "tokenizer" parameter to the synonymTokenFilter and synonymGraphTokenFilter deprecated with ES 6.x?

Per the Elastic documentation, the "tokenizer" parameter is officially deprecated, but from a functionality standpoint it appears to have been removed outright: the parameter is simply ignored on newly created indices.
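
For reference, the old-style configuration looked roughly like this (a minimal Kibana Console sketch; the index, filter, and analyzer names are made up, and "whitespace" was the old default for parsing the rules):

```
PUT /my_synonym_index
{
  "settings": {
    "analysis": {
      "filter": {
        "my_synonyms": {
          "type": "synonym",
          // Deprecated in 6.x and ignored on new indices: the tokenizer
          // used to parse the synonym rules themselves
          "tokenizer": "whitespace",
          "synonyms": [
            "i-pod, i pod => ipod",
            "AT&T => ATT"
          ]
        }
      },
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "whitespace",
          "filter": [ "my_synonyms" ]
        }
      }
    }
  }
}
```

On an index created under 6.x, the rules are instead parsed with the analyzer's own tokenizer and whatever filters precede the synonym filter in the chain.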

This is a pretty drastic change. For the common case it saves a lot of pre-processing of the synonyms file that most people probably weren't doing anyway, since the rules are now parsed with the same analysis chain the filter sits in. However, manipulating punctuation in synonym rules has been a historical workaround for assorted problems that pop up with phrasing, and there are plugins that depend on this behavior (sketched below).
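
To make the punctuation point concrete (reusing the hypothetical index above): with whitespace tokenization end to end, a rule like "AT&T => ATT" keeps its punctuation through rule parsing and matches at analysis time. The old parameter also let you parse the rules with a punctuation-preserving tokenizer while the surrounding analyzer used a different one, which is the kind of deliberate mismatch those workarounds relied on; in 6.x the rules always go through the analyzer's own chain. A quick check that the rule fires:

```
GET /my_synonym_index/_analyze
{
  "analyzer": "my_analyzer",
  "text": "AT&T announced i pod sales"
}
```

Expected tokens, roughly: "ATT", "announced", "ipod", "sales".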

I'm not asking for new workarounds to a specific problem; I'm just curious about the reasoning behind disabling the functionality entirely rather than changing the default behavior.
