I have set up an index with the following settings:
{
  "settings": {
    "analysis": {
      "analyzer": {
        "html_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "char_filter": [
            "html_strip"
          ],
          "filter": [
            "lowercase",
            "asciifolding",
            "stop"
          ]
        }
      }
    }
  }
}
What I’d like to do now is add a list of custom multi-word phrases that get indexed as single tokens. How can I do that?
For example, let’s say that these are special words:
- three musketeers
- mini pekka
When a user searches for "three musketeers", I want it to match documents containing that whole phrase, and not match "three" and "musketeers" individually.
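
To make the goal concrete, this is roughly the behavior I'm after when testing with the analyze API (the index name my_index is just a placeholder):

POST /my_index/_analyze
{
  "analyzer": "html_analyzer",
  "text": "the three musketeers attacked the mini pekka"
}

With my current settings this returns separate tokens like "three" and "musketeers", but I'd like "three musketeers" and "mini pekka" to each come back as a single token.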
Thanks!