Hi,
Elasticsearch version: 6.6.2
I am trying to build a simple_query_string search that applies fuzziness to every word, and then strip the fuzzy operator from the stop words using a character filter, but it is not working.
Example:
Create the index with the given stop words and a character filter:
PUT test
{
  "settings": {
    "number_of_shards": 1,
    "analysis": {
      "filter": {
        "pl_stop": {
          "type": "stop",
          "stopwords": ["i", "ma", "a"]
        }
      },
      "char_filter": {
        "my_char_filter": {
          "type": "mapping",
          "mappings": [
            "i~ => i",
            "ma~ => ma",
            "a~ => a"
          ]
        }
      },
      "analyzer": {
        "default": {
          "char_filter": "my_char_filter",
          "tokenizer": "standard",
          "filter": ["standard", "lowercase", "pl_stop"]
        }
      }
    }
  },
  "mappings": {
    "_doc": {
      "properties": {
        "a": { "type": "text" },
        "b": { "type": "text" }
      }
    }
  }
}
Then I add two documents:
PUT test/_doc/1
{
  "a": "php i mysql",
  "b": "php i mysql"
}
PUT test/_doc/2
{
  "a": "php i",
  "b": "mysql"
}
I check that the analyzer works, and it behaves correctly:
GET test/_analyze
{
  "analyzer": "default",
  "text": "php~ i~ mysql~"
}
=>
{
  "tokens": [
    {
      "token": "php",
      "start_offset": 0,
      "end_offset": 3,
      "type": "<ALPHANUM>",
      "position": 0
    },
    {
      "token": "mysql",
      "start_offset": 8,
      "end_offset": 13,
      "type": "<ALPHANUM>",
      "position": 2
    }
  ]
}
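For reference, the expected behaviour of this analysis chain can be sketched in plain Python. This is only a rough simulation, not how Elasticsearch works internally: the real `mapping` char filter does greedy longest-match replacement over the character stream, and the `standard` tokenizer is more elaborate than a regex; but on this input it reproduces the `_analyze` output above.

```python
import re

# Settings taken from the index definition above
STOPWORDS = {"i", "ma", "a"}
CHAR_MAPPINGS = {"i~": "i", "ma~": "ma", "a~": "a"}

def analyze(text):
    # char_filter step: literal replacements, like the "mapping" char filter
    for src, dst in CHAR_MAPPINGS.items():
        text = text.replace(src, dst)
    # tokenizer step: keep word characters only (any leftover "~" is dropped)
    tokens = re.findall(r"\w+", text)
    # filter steps: lowercase, then remove stop words
    return [t.lower() for t in tokens if t.lower() not in STOPWORDS]

print(analyze("php~ i~ mysql~"))  # ['php', 'mysql']
```

Note that the `standard` tokenizer drops the trailing `~` by itself, so at plain analysis time the stop word `i` is removed either way; the difference only matters once the query parser interprets `~` as a fuzzy operator.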
But when I run the simple_query_string search, I get 0 results when I should get 2:
GET test/_search
{
  "query": {
    "simple_query_string": {
      "query": "pap~ i~ mysal~",
      "fields": ["a", "b"],
      "default_operator": "AND"
    }
  }
}
=>
{
  "took": 4,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 0,
    "max_score": null,
    "hits": []
  }
}
My question is: is there a way to apply fuzziness to every word in a query and have Elasticsearch either strip the fuzziness from the stop words or drop them entirely, so that I get 2 results in this example?
Or is my only option to track the stop words myself and not let them be fuzzied?
Thank you in advance for any answers.