I have some very simple index settings that I'm using to test my analyzer:
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    "applog": {
      "properties": {
        "user_id": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
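For completeness, the index (named applog-test, as seen in the mapping check below) was created by PUTing those settings:

```shell
# Create the test index with the settings and mappings shown above.
# (Assumes Elasticsearch is running locally on the default port 9200.)
curl -XPUT "localhost:9200/applog-test" -d '{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    "applog": {
      "properties": {
        "user_id": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}'
```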
I simply want to confirm that the user_id property of my applog document type will not get tokenized.
However, after confirming that the mappings are indeed in place:
curl localhost:9200/applog-test/_mapping?pretty
{
  "applog-test" : {
    "mappings" : {
      "applog" : {
        "properties" : {
          "user_id" : {
            "type" : "string",
            "index" : "not_analyzed"
          }
        }
      }
    }
  }
}
The analyzer still appears to be tokenizing the user_id value... hopefully this is user error?
curl -XGET "localhost:9200/applog-test/_analyze?user_id&pretty" -d "this is a test"
{
  "tokens" : [ {
    "token" : "this",
    "start_offset" : 0,
    "end_offset" : 4,
    "type" : "<ALPHANUM>",
    "position" : 0
  }, {
    "token" : "is",
    "start_offset" : 5,
    "end_offset" : 7,
    "type" : "<ALPHANUM>",
    "position" : 1
  }, {
    "token" : "a",
    "start_offset" : 8,
    "end_offset" : 9,
    "type" : "<ALPHANUM>",
    "position" : 2
  }, {
    "token" : "test",
    "start_offset" : 10,
    "end_offset" : 14,
    "type" : "<ALPHANUM>",
    "position" : 3
  } ]
}
My expectation was that this would all come back as a single token.
Bonus points: how do I get the desired behavior with a dynamic mapping? I ran into the same issue with dynamic mappings and/or default mappings.