I have created a new index named `tbxmlcs` and then added this mapping:
```json
{
  "properties": {
    "34084-4": {
      "type": "keyword",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 256
        }
      }
    }
  }
}
```
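(For reference, I applied the mapping with the update-mapping API, roughly like this; the request line is my reconstruction, the body is exactly the mapping above:)

```
PUT tbxmlcs/_mapping
{
  "properties": {
    "34084-4": {
      "type": "keyword",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 256
        }
      }
    }
  }
}
```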
After that I tried to copy some filtered data from the old index into the new index:
```
POST _reindex
{
  "source": {
    "index": "tbxmlfile",
    "query": {
      "bool": {
        "must": [
          {
            "terms": {
              "34084-4": [
                "copd"
              ]
            }
          }
        ]
      }
    }
  },
  "dest": {
    "index": "tbxmlcs"
  }
}
```
I am getting the error message below:
```json
{
  "took" : 4503,
  "timed_out" : false,
  "total" : 183,
  "updated" : 154,
  "created" : 0,
  "deleted" : 0,
  "batches" : 1,
  "version_conflicts" : 0,
  "noops" : 0,
  "retries" : {
    "bulk" : 0,
    "search" : 0
  },
  "throttled_millis" : 0,
  "requests_per_second" : -1.0,
  "throttled_until_millis" : 0,
  "failures" : [
    {
      "index" : "tbxmlcs",
      "type" : "_doc",
      "id" : "z8cqgXcBwCxB52uVeNre",
      "cause" : {
        "type" : "illegal_argument_exception",
        "reason" : "Document contains at least one immense term in field=\"34084-4\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[60, 115, 101, 99, 116, 105, 111, 110, 32, 73, 68, 61, 34, 98, 49, 53, 101, 53, 97, 101, 56, 45, 51, 48, 97, 49, 45, 52, 98, 102]...', original message: bytes can be at most 32766 in length; got 36716",
        "caused_by" : {
          "type" : "max_bytes_length_exceeded_exception",
          "reason" : "bytes can be at most 32766 in length; got 36716"
        }
      },
      "status" : 400
    }
  ]
}
```
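If it helps: decoding the byte prefix in the error message gives `<section ID="b15e5ae8-30a1-4bf`..., so at least one document apparently stores a large chunk of raw XML (36,716 bytes) in the `34084-4` field. Since `_reindex` keeps document ids by default, the failing source document should be retrievable with the id from the failure, e.g.:

```
GET tbxmlfile/_doc/z8cqgXcBwCxB52uVeNre
```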
I have tried increasing the memory of Elasticsearch and also tried updating the zones, but I still get the same error.
I want to use a case-sensitive search: for example, a query on my field `34084-4` for `COLD` should match only `COLD`, not `Cold` or `cold` or any other combination.
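For illustration, this is the kind of exact, case-sensitive lookup I am after (a `term` query on the `keyword` field; the query itself is just a sketch, not something from my application):

```
GET tbxmlcs/_search
{
  "query": {
    "term": {
      "34084-4": "COLD"
    }
  }
}
```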
I have also tried an exact-match query, but it still did not work.
I don't want to reload the data with a new mapping.
Can you please guide me through this?