How to search special characters in Elasticsearch

I want to query the name field for the string "9.999%", but it is not working.

These are my index analysis settings:
'analysis' => [
    'analyzer' => [
        'default' => [
            'type' => 'custom',
            'tokenizer' => 'whitespace',
            'filter' => [
                'lowercase',
                'word_delimiter',
            ],
        ],
    ],
],

This is how it's analyzed:

POST _analyze
{
  "text": "9.999%",
  "tokenizer": "whitespace",
  "filter": ["lowercase", "word_delimiter"]
}

Gives:

{
  "tokens" : [
    {
      "token" : "9",
      "start_offset" : 0,
      "end_offset" : 1,
      "type" : "word",
      "position" : 0
    },
    {
      "token" : "999",
      "start_offset" : 2,
      "end_offset" : 5,
      "type" : "word",
      "position" : 1
    }
  ]
}
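
So the word_delimiter filter drops the "." and "%" and splits the value into the two tokens "9" and "999", which is why an exact search for "9.999%" cannot match. As a rough sketch only (it may or may not fit your use case), word_delimiter has a preserve_original option that keeps the unsplit token alongside the parts:

POST _analyze
{
  "text": "9.999%",
  "tokenizer": "whitespace",
  "filter": [
    "lowercase",
    // inline custom filter: same word_delimiter, but also emit the original token "9.999%"
    { "type": "word_delimiter", "preserve_original": true }
  ]
}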

I'm not exactly sure about your use case.

Could you provide a full recreation script as described in About the Elasticsearch category? It would help to better understand what you are doing. Please try to keep the example as simple as possible.

A full reproduction script is something anyone can copy and paste into the Kibana Dev Console and run to reproduce your use case. It will help readers understand, reproduce, and if needed fix your problem. It will also most likely help you get a faster answer.
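
For example, a minimal sketch based on the settings you posted could look like this (the index name my-index and the field name are my assumptions, adjust them to match your real setup):

// create the index with your analyzer
PUT my-index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "default": {
          "type": "custom",
          "tokenizer": "whitespace",
          "filter": [ "lowercase", "word_delimiter" ]
        }
      }
    }
  }
}

// index a sample document
PUT my-index/_doc/1?refresh=true
{
  "name": "9.999%"
}

// run the search that is not behaving as expected
POST my-index/_search
{
  "query": {
    "match": {
      "name": "9.999%"
    }
  }
}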

Thanks, I understand.
