Elasticsearch (2.3.4) ignores not_analyzed even though it was specified during index creation

I created a new index with the following mapping, which turns off analysis on the name field:

curl -XPUT localhost:9200/vacation/ -d '
{
  "mappings": {
    "rentals": {
      "properties": {
        "name": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'

The request was acknowledged, and I then inserted a new document into the index:

{ "name": "fire house" }
Since analysis is turned off, the document name, "fire house", shouldn't be tokenized. So unless the query is exactly "fire house", the document shouldn't be retrieved. However, the document is retrieved even when the query term is just "fire" or "house". Could this be an issue with the version of Elasticsearch I'm using, or with the way I specified the mapping during index creation?
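In case it matters, this is how I would read the mapping back to double-check that it took effect (a sketch, using the same index and type names as above):

curl -XGET 'localhost:9200/vacation/_mapping/rentals?pretty'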

Thank you in advance.

Could you show how you index your data?

FYI: you can see how analyzers work on a field using the _analyze API.
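For example, something like this (a sketch, using your index and the name field) shows the tokens that would be produced:

curl -XGET 'localhost:9200/vacation/_analyze?pretty' -d '
{
  "field": "name",
  "text": "fire house"
}'

If the field really is not_analyzed, this should come back essentially untokenized (a single "fire house" term); separate "fire" and "house" tokens would suggest the mapping did not take effect.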

This field would by default be included in the _all field, which is analysed. Do you specify the field name in your query?
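Roughly, the difference looks like this (a sketch, reusing the index and type from your commands): a term query pinned to the name field versus a query with no field, which defaults to searching _all:

curl -XGET 'localhost:9200/vacation/rentals/_search?pretty' -d '
{
  "query": { "term": { "name": "fire house" } }
}'

curl -XGET 'localhost:9200/vacation/rentals/_search?pretty' -d '
{
  "query": { "query_string": { "query": "fire" } }
}'

The first should only match the exact not_analyzed value; the second can match on "fire" alone because _all is analyzed.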

I'm just running this command:

curl -XPOST localhost:9200/vacation/rentals/1 -d '{ "name": "fire house" }'

Hello Christian, this is what I entered as the search query: { "term" : { "name" : "fire" } }. I specified the name field, as you can see, and yet the document with the name "fire house" is still being retrieved.

Christian, I tried the search query again with a match on the name field, and it finally worked. It's strange that it didn't work the first time I tried it, but now it does :slight_smile:
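For reference, what I'm running now is something along these lines (the exact body may differ slightly):

curl -XGET 'localhost:9200/vacation/rentals/_search?pretty' -d '
{
  "query": { "match": { "name": "fire house" } }
}'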