Lars2
December 22, 2016, 2:57pm
1
Hey, I want to search my records case-insensitively. I have the following mapping for my field:
"name": {
"type": "string",
"analyzer": "lowercase_analyzer"
},
Where lowercase_analyzer is:
"settings": {
"analysis": {
"analyzer": {
"lowercase_analyzer": {
"type": "custom",
"tokenizer": "keyword",
"filter": [
"lowercase"
]
}
}
}
}
If I now search the name field with anything other than lowercase, it does not find my records.
Is there any way to have a case-insensitive mapping?
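For reference, the two snippets above combine into a single create-index request roughly as follows; the index name my_index and type name my_type are placeholders, not taken from the original post, and on Elasticsearch 5.x the string type is deprecated in favour of text (the analyzer setting works the same way):

PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "lowercase_analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [
            "lowercase"
          ]
        }
      }
    }
  },
  "mappings": {
    "my_type": {
      "properties": {
        "name": {
          "type": "string",
          "analyzer": "lowercase_analyzer"
        }
      }
    }
  }
}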
geppo
December 22, 2016, 5:42pm
2
Yes, you have to change your tokenizer from keyword to whitespace or standard; you can also apply language-specific rules, see https://www.elastic.co/guide/en/elasticsearch/reference/current/analysis-lang-analyzer.html . The keyword tokenizer, as described in the documentation at https://www.elastic.co/guide/en/elasticsearch/reference/5.0//analysis-keyword-tokenizer.html , takes all the terms in that field and makes them a single token, a keyword. This setting is useful when you index fields with few but significant words (e.g. New York or New Mexico as a single token, a code like ak234 as a single token, etc.).
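For example, a custom analyzer along those lines could keep the lowercase filter and only swap the tokenizer (a sketch; the analyzer name lowercase_standard_analyzer is arbitrary):

"settings": {
  "analysis": {
    "analyzer": {
      "lowercase_standard_analyzer": {
        "type": "custom",
        "tokenizer": "standard",
        "filter": [
          "lowercase"
        ]
      }
    }
  }
}

With the standard tokenizer, "New York" is split into the two tokens new and york, so a search in any casing can match either word, but the field no longer behaves as a single keyword.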
Lars2
December 23, 2016, 8:16am
3
Thanks for the reply.
The problem is that I still need to take all the terms and make them a single token, but I also need it to be case-insensitive. Is that possible?
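One way to see what the existing lowercase_analyzer actually produces is the _analyze API (a sketch, assuming the index is called my_index):

GET my_index/_analyze
{
  "analyzer": "lowercase_analyzer",
  "text": "New York"
}

With the keyword tokenizer plus the lowercase filter this returns the single token new york, so the field is already indexed as one lowercased token. If searches still miss, a likely cause (an assumption, since the query is not shown in the thread) is the query side: a term query is not analyzed and compares the original casing verbatim, while a match query runs the search text through the same analyzer.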
system (system) Closed
January 20, 2017, 8:16am
4
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.