Hi Team,
I have the following schema in my index. When I try to add content with a very large length to a field, I get the error below.
Mapping:
{
  "fields": {
    "type": "nested",
    "properties": {
      "uid": {
        "type": "keyword"
      },
      "value": {
        "type": "text",
        "fields": {
          "raw": {
            "type": "keyword",
            "ignore_above": 32766
          }
        },
        "copy_to": [
          "fulltext"
        ],
        "analyzer": "html_strip_analyzer"
      }
    }
  },
  "fulltext": {
    "type": "text",
    "analyzer": "html_strip_analyzer"
  }
}
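
For reference, the create command that fails looks roughly like this (the index name, document ID, and payload here are illustrative; the real value is a long HTML string whose first characters, decoded from the byte prefix in the error, are "When using Roblox Studio's scr"):

PUT my_index/_doc/1?op_type=create
{
  "fields": [
    {
      "uid": "field-1",
      "value": "When using Roblox Studio's scr... (HTML content of roughly 32 K characters) ..."
    }
  ]
}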
Error:
Document contains at least one immense term in field="fields.value.raw" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[87, 104, 101, 110, 32, 117, 115, 105, 110, 103, 32, 82, 111, 98, 108, 111, 120, 32, 83, 116, 117, 100, 105, 111, 39, 115, 32, 115, 99, 114]...', original message: bytes can be at most 32766 in length; got 32778
Elasticsearch version: 6.3.1
Even after specifying ignore_above, the document fails on the create command.
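
One thing I notice: the error says the term is 32778 bytes, only 12 bytes over the limit, even though ignore_above is set to 32766. As far as I understand, ignore_above counts characters while Lucene's 32766 limit counts UTF-8 bytes, so a value within 32766 characters can still encode to more than 32766 bytes if it contains multi-byte characters. Is the fix simply to lower ignore_above to a safe value, e.g. 32766 / 4 = 8191 (since a UTF-8 character can take up to 4 bytes), as in the sketch below, or is there a better approach?

"raw": {
  "type": "keyword",
  "ignore_above": 8191
}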