I have a set of processed texts. Among them is, for example, the Swedish word "interkontinental". My processing works to the extent that I do find "interkontinental" if I search for exactly that, but if I search for "inter", "kontinental", or "kontinent" I do not find it.
I am running Elasticsearch embedded. Is that an issue? Since I provide the list of words in the settings, i.e. not referencing a separate file, I would not expect it to be.
Am I expecting the wrong thing? This is not acceptable for my use case. The word "fastland" ("mainland" in English) occurs in the texts, but it is not found when searching for "land".
This seems like a small and simple case to me. Any help is much appreciated. I have tested the order of the filters in the analyser with no different result.
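For reference, here is a minimal sketch of the kind of settings the post describes, using the standard `dictionary_decompounder` token filter (which splits compounds against an inline word list). The index name, word list, and filter name are assumptions for illustration; only the analyser name "swedish_l" is taken from a later post:

```json
PUT /texts
{
  "settings": {
    "analysis": {
      "filter": {
        "swedish_decompounder": {
          "type": "dictionary_decompounder",
          "word_list": ["fast", "land", "inter", "kontinent", "kontinental"]
        }
      },
      "analyzer": {
        "swedish_l": {
          "tokenizer": "standard",
          "filter": ["lowercase", "swedish_decompounder"]
        }
      }
    }
  }
}
```

With settings like these, indexing "interkontinental" should also emit the subword tokens from the list, so a search for "kontinental" can match.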
I tried your example and it works fine. Are you sure that your index is created with the correct settings? The analysis must be set under settings. Here is the recreation I tried:
Thanks Jim
The analysis was under settings, but I had no mappings. I believe the mappings specify that when a "doc" is indexed, the "swedish_l" analyser shall be used. When I did what you said, it works. What I need next is an extensive list of words. My idea was to use "word_list_path" and point to it. Looking around on the net, I found that the hunspell dictionary seems to be the most used. It is also incorporated in the Elasticsearch source code, which is reassuring. Since the format of each line for word forms is different from what the "word_list" property held, I concluded that the hunspell plugin must be used with that list. What I now have is:
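The configuration itself is not shown in the post; a hunspell token filter of the kind described, together with a mapping that applies the "swedish_l" analyser to the "doc" type, might look roughly like this (the locale "se_SV" is taken from the next post; the index name and field name are assumptions):

```json
PUT /texts
{
  "settings": {
    "analysis": {
      "filter": {
        "swedish_hunspell": {
          "type": "hunspell",
          "locale": "se_SV"
        }
      },
      "analyzer": {
        "swedish_l": {
          "tokenizer": "standard",
          "filter": ["lowercase", "swedish_hunspell"]
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "text": {
          "type": "string",
          "analyzer": "swedish_l"
        }
      }
    }
  }
}
```

Note that the `"string"` field type matches the older Elasticsearch versions implied by the source paths mentioned below; newer versions use `"text"` instead.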
Before it could execute, I got an exception saying that the "se_SV" dictionary could not be loaded. I fixed the path and converted the file I had from ISO-8859-1 to UTF-8.
did not work, and the file layout specified as "conf/hunspell" did not work either. By looking at the ES source code in core/src/main/java/org/elasticsearch/env/Environment.java, line 103, I changed it to "config/hunspell", and then it works, provided that the se_SV folder is at $path_home/config/hunspell/se_SV.
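To spell out the layout that worked: Elasticsearch expects one folder per locale under the hunspell config directory, with the dictionary files inside it. The file names here assume the dictionary ships as a standard .aff/.dic pair named after the locale:

```
$path_home/
  config/
    hunspell/
      se_SV/
        se_SV.aff
        se_SV.dic
```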
My questions:
In general, would you say I am on the right track?
The "fastland" example does not work. Any idea why?
I have gained some insights since my last post and am writing them down in case someone is about to respond to this. In retrospect, my own view of my post is that it is somewhat confusing, so it makes sense for me to explain myself.
I thought that words like "fastland" and "interkontinental" would automatically be broken down into proper subwords by some well-crafted internal machinery. When I wrote "fastland does not work", what I had in mind was that it did not get analysed into the tokens "fast" and "land"; I only got "fastland" back. In the case of "fastland" it is not certain that this is the desired behaviour, but with "interkontinental" there should be some hit when searching for "kontinental".
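One way to check exactly which tokens an analyser emits, rather than inferring it from search results, is the `_analyze` API. The index and analyser names here are taken from the earlier posts; depending on the Elasticsearch version, the parameters may need to be passed as query parameters instead of a JSON body:

```json
GET /texts/_analyze
{
  "analyzer": "swedish_l",
  "text": "interkontinental"
}
```

If the response contains only the token "interkontinental", the compound is not being split at all; a working decompounder should return subword tokens alongside it.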