Elasticsearch Nori user dictionary: registering words that include a space

Hello.
I have a problem with the Nori user dictionary. I want to register words that include a space, but it is not working.

I tried registering the words in the dictionary wrapped in double quotes and even in single quotes, but that did not resolve it.

Request:

PUT my_nori
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "my_nori_tokenizer": {
          "type": "nori_tokenizer",
          "user_dictionary_rules": [
            "코끼리 열차"
          ]
        }
      }
    }
  }
}

GET my_nori/_analyze
{
  "tokenizer": "my_nori_tokenizer",
  "text": [
    "코끼리 열차"
  ]
}

Response:

{
  "tokens": [
    {
      "token": "코끼",
      "start_offset": 1,
      "end_offset": 3,
      "type": "word",
      "position": 0
    },
    {
      "token": "열차",
      "start_offset": 4,
      "end_offset": 6,
      "type": "word",
      "position": 1
    }
  ]
}
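
If I read the nori_tokenizer documentation correctly, the first whitespace-separated token of a user_dictionary_rules entry is the surface form and anything after it is that word's compound segmentation, so "코끼리 열차" seems to be treated as the word 코끼리 plus a segment rather than as a single term containing a space. Below is a sketch using the compound-style form instead; the index name, the decompound_mode setting and the spelling 코끼리열차 (without the space) are only assumptions for illustration, and this would only yield a single token when the text itself is written without the space.

PUT my_nori_compound
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "my_nori_tokenizer": {
          "type": "nori_tokenizer",
          "decompound_mode": "mixed",
          "user_dictionary_rules": [
            "코끼리열차 코끼리 열차"
          ]
        }
      }
    }
  }
}

GET my_nori_compound/_analyze
{
  "tokenizer": "my_nori_tokenizer",
  "text": [
    "코끼리열차"
  ]
}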

It is similar to the problem in a previous topic, but there was no solution there.

Please let me know if there is a solution!
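
In the meantime, a possible query-time workaround (just a sketch, not confirmed as the intended solution) would be to leave the two tokens as the tokenizer produces them and match them as a phrase at search time. my_field below is a hypothetical text field that uses an analyzer built on my_nori_tokenizer.

GET my_nori/_search
{
  "query": {
    "match_phrase": {
      "my_field": "코끼리 열차"
    }
  }
}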
