I have also tried changing the tokenizer to keyword, but it gives the same results as before.
[Just to explain: essentially I am trying to build a full-text search (which runs fine with the default mapping), but if the user puts the search query between quotes, I want to perform an exact, case-sensitive match instead. The query could be a single word or a phrase.]
So before thinking of modifying things in FSCrawler, you need to find the right analyzers and mapping for your use case.
The easiest way is to start with some sample minimalist documents (just one field) and play around with the API.
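For example, a minimal index to experiment with might look like the following. This is only a sketch: the index name, analyzer name, and sub-field name are illustrative, and the idea is a `text` field with an extra sub-field whose custom analyzer keeps the standard tokenizer but applies no lowercase filter, so that sub-field stays case-sensitive:

```json
PUT /test_exact
{
  "settings": {
    "analysis": {
      "analyzer": {
        "case_sensitive": {
          "type": "custom",
          "tokenizer": "standard"
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "content": {
        "type": "text",
        "fields": {
          "exact": {
            "type": "text",
            "analyzer": "case_sensitive"
          }
        }
      }
    }
  }
}
```

With a mapping like this you can run a normal `match` query on `content` for the case-insensitive search, and a `match_phrase` query on `content.exact` for the quoted, case-sensitive search, and compare the results on a handful of sample documents.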
Once you have a full script which reproduces the problem as described in About the Elasticsearch category, share it here. It will help to better understand what you are doing. Please, try to keep the example as simple as possible.
A full reproduction script will help readers to understand, reproduce and if needed fix your problem. It will also most likely help to get a faster answer.
I am having a challenge implementing a search analyzer for the content field so that the behavior depends on the query: if the query string has no quotes, it returns all results regardless of case (currently working), but if the query is between quotes, it returns only exact matches (of a word or phrase).
I tried not analyzing the field, but that doesn't work; neither does normalization.
I truly appreciate your feedback. Kindly guide me through the steps, e.g. changing the analyzer so that I can do both partial and exact matches depending on the query.
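The quote-detection part can live entirely on the client side: inspect the raw input and build a different query body for each case. A minimal sketch, assuming a mapping where `content` is an analyzed `text` field and `content.exact` is a case-sensitive sub-field (both names are assumptions, not anything FSCrawler creates by default):

```python
def build_query(user_input: str) -> dict:
    """Build an Elasticsearch query body from raw user input.

    Quoted input  -> match_phrase on the case-sensitive `content.exact`
    sub-field (exact, case-sensitive word or phrase).
    Unquoted input -> standard analyzed match on `content`
    (case-insensitive full-text search).
    """
    text = user_input.strip()
    if len(text) > 2 and text.startswith('"') and text.endswith('"'):
        # Strip the surrounding quotes and match the phrase exactly.
        return {"query": {"match_phrase": {"content.exact": text[1:-1]}}}
    # Default: analyzed full-text search, not case sensitive.
    return {"query": {"match": {"content": text}}}
```

The key design point is that the quotes never reach Elasticsearch; they only select which field and query type to use.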