Hi, I'm desperate. I have the following mapping:
{
  "my_index": {
    "aliases": {},
    "mappings": {
      "a_c": {
        "properties": {
          "id": {
            "type": "string"
          },
          "s": {
            "type": "nested",
            "properties": {
              "c_r": {
                "type": "nested",
                "properties": {
                  "c": { "type": "string" },
                  "end": { "type": "long" },
                  "id": { "type": "string" },
                  "start": { "type": "long" }
                }
              },
              "g": { "type": "string" },
              "id": { "type": "string" }
            }
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1505476515647",
        "number_of_shards": "5",
        "number_of_replicas": "1",
        "uuid": "_0IiQCPrQ1i-kDP1481y8w",
        "version": {
          "created": "2030099"
        }
      }
    },
    "warmers": {}
  }
}
I am trying to insert new c_r entries into my index using a Groovy update script (my Elasticsearch is version 2.x), but I have a serious problem: when a document has many s entries, Elasticsearch hangs during the insertion. I suspect the bottleneck is indexing, because the number of c_r entries inside each s is very large. Is there any way to perform the insertion while avoiding indexing? I also cannot disable indexing on "c", "start", and "end", because I need those fields for other searches.
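For reference, I know that in Elasticsearch 2.x a string field can be excluded from the index with the "index" option, roughly like this (just a sketch; as I said, I cannot do this for "c", "start" and "end" because I search on them):

```json
{
  "g": {
    "type": "string",
    "index": "no"
  }
}
```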
The script is (c_r is the same field shown in the mapping above):
{
  "script": "boolean f = false; for (int i = 0; i < ctx._source.s.size(); i++) { if (ctx._source.s[i].id == s.id) { ctx._source.s[i].c_r = s.c_r; f = true; break; } } if (!f) { ctx._source.s.add(s); }",
  "params": params
}
In params, I pass a single s containing a very large number of c_r entries.
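For context, the update request I send looks roughly like this (the document id and all field values below are made up for illustration; the real c_r array is much longer):

```json
POST /my_index/a_c/1/_update
{
  "script": "boolean f = false; for (int i = 0; i < ctx._source.s.size(); i++) { if (ctx._source.s[i].id == s.id) { ctx._source.s[i].c_r = s.c_r; f = true; break; } } if (!f) { ctx._source.s.add(s); }",
  "params": {
    "s": {
      "id": "s_1",
      "g": "g_1",
      "c_r": [
        { "id": "r_1", "c": "a", "start": 100, "end": 200 },
        { "id": "r_2", "c": "b", "start": 300, "end": 400 }
      ]
    }
  }
}
```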
Any ideas? Thanks!!!