Hi guys,
I have a cluster running and I have run into a problem with including special characters in my search query. I did not set up the mapping for the index: the mapping is dynamic and the analyzer is the standard one. I got the analyzer information from the Analyze API.
> GET /<index>/_analyze
> {
>   "text": "some-data"
> }
which gave me the output:
{
  "tokens": [
    {
      "token": "some",
      "start_offset": 0,
      "end_offset": 4,
      "type": "<ALPHANUM>",
      "position": 0
    },
    {
      "token": "data",
      "start_offset": 5,
      "end_offset": 9,
      "type": "<ALPHANUM>",
      "position": 1
    }
  ]
}
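For comparison, running the same text through the built-in keyword analyzer (which does no tokenization) keeps the hyphen, just to confirm it is the standard analyzer that splits on it:

```json
GET /<index>/_analyze
{
  "analyzer": "keyword",
  "text": "some-data"
}
```

This returns a single token "some-data", so the special character is only lost because of the standard analyzer's tokenization at index time.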
My current mapping:

"message": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
}
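Since dynamic mapping also created a keyword subfield, the raw value with the special character should still be stored there. This exact-match query on message.keyword is the closest I have gotten (a sketch, not a confirmed solution):

```json
GET /<index>/_search
{
  "query": {
    "term": {
      "message.keyword": "some-data"
    }
  }
}
```

But a term query on the keyword subfield only matches the whole field value exactly; it does not help with partial matches that contain the special character.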
Based on this output I concluded that the analyzer is the standard one, so the special character is not indexed. Now when I include a special character in my search query I do not get the desired results, and when I try to specify a different analyzer in the search query I get no results at all.
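To be concrete, this is roughly what I tried for the second case (the whitespace analyzer here is just an example of "another analyzer"):

```json
GET /<index>/_search
{
  "query": {
    "match": {
      "message": {
        "query": "some-data",
        "analyzer": "whitespace"
      }
    }
  }
}
```

This returns no hits, presumably because the whitespace-analyzed query token "some-data" never matches the indexed tokens "some" and "data".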
My question is: is there any way to include special characters in my search query other than changing my mapping?