Let's look at the built-in analyzer named "persian":
var analyzeResponse = client.Indices.Analyze(a => a
.Analyzer("persian")
.Text("word1 words2 ManyWord3 LargeWOrds4 SmallWord5 چهار")
);
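For reference, the same analysis can be reproduced without NEST by calling the `_analyze` REST API directly (a minimal sketch, assuming a stock Elasticsearch node; shown in Kibana Dev Tools syntax):

```
GET _analyze
{
  "analyzer": "persian",
  "text": "word1 words2 ManyWord3 LargeWOrds4 SmallWord5 چهار"
}
```

Comparing the `tokens` array in the response against the input text shows which words survive analysis.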
These are just some sample words, so take them as they are; what I do not understand is why there is a bug with the word "چهار" while there is no problem indexing the other words.
Let me explain with an analogy. Suppose the Persian analyzer tokenizes a sentence such as
"Hello World I am noob" but refuses to tokenize the word "bye", while it happily tokenizes "Eye", "bie", "byk", "eby", or "byed". That is exactly my situation with the Persian analyzer: it specifically cannot index or tokenize the word "چهار". The word has four characters, and changing any one of them, or even changing the order of the characters in this particular word, makes the problem disappear.
"چهار" means "four" in Persian. I have a problem with this one word only, and no problem with thousands of other words.
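One way to narrow this down might be to run the word through the `_analyze` API with only the `standard` tokenizer (which the "persian" analyzer is built on), bypassing its token filters — a diagnostic sketch, not part of my original code:

```
GET _analyze
{
  "tokenizer": "standard",
  "text": "چهار"
}
```

If the tokenizer alone emits a token for the word, then the word must be getting dropped later, by one of the analyzer's token filters.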
It took me a full day to confirm that my ASP.NET Core and Elasticsearch NEST code for mapping and indexing is correct, and that the problem is limited to this one particular word.
I also discovered something new: when I add this particular word to the synonym.txt list used by my custom analyzer, Elasticsearch fails to start (the error is a syntax error in the synonym file). This word keeps causing trouble.
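For completeness, the synonym.txt entry follows the usual Solr-style comma-separated format (the second term on this line is a hypothetical placeholder, not my real synonym group):

```
# one group of equivalent terms per line, comma-separated
چهار, 4
```

Even with a line in this simple, seemingly well-formed format, the node refuses to start once "چهار" appears in it.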