Hello @Christian_Dahlqvist,
I've discovered the source of this issue. It appears to be the bounding box field that is at fault, though I have no idea why.
Once I remove the bounding box from the data being ingested, the index is a normal size (600 documents --> 550 KB), but as soon as I add the bounding box back in (with a brand new index), the size skyrockets (3,593 documents --> 1.6 GB), even though only 84 of those documents contain a bounding box.
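In case it matters, I'm comparing the sizes roughly like this (INDEX_NAME being a placeholder for the actual index name):
GET /_cat/indices/INDEX_NAME?v&h=index,docs.count,store.size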
Below is the JSON of the bounding box:
"placeBoundingBox": {
"type": "polygon",
"coordinates": [
[
[
-71.191421,
42.227797
],
[
-71.191421,
42.399542
],
[
-70.986004,
42.399542
],
[
-70.986004,
42.227797
],
[
-71.191421,
42.227797
]
]
]
}
The mapping associated with the bounding box (from calling GET /INDEX_NAME):
"placeBoundingBox": {
"type": "geo_shape",
"tree": "quadtree",
"precision": "1.0m"
}
To demonstrate that the mapping does in fact work and is creating a proper geo_shape (even though Kibana doesn't recognize it as a geo_shape), I ran the following query and got back a successful hit:
GET /_search
{
  "query": {
    "bool": {
      "must": {
        "match_all": {}
      },
      "filter": {
        "geo_shape": {
          "placeBoundingBox": {
            "shape": {
              "type": "polygon",
              "coordinates": [
                [
                  [ -71.191421, 42.227797 ],
                  [ -71.191421, 42.399542 ],
                  [ -70.986004, 42.399542 ],
                  [ -70.986004, 42.227797 ],
                  [ -71.191421, 42.227797 ]
                ]
              ]
            },
            "relation": "within"
          }
        }
      }
    }
  }
}
I'd like to keep the bounding box in the index. Is there something wrong with either the mapping or the data? Is a precision of 1.0m too fine-grained?
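If the precision is indeed the culprit, this is the coarser mapping I would try on a fresh index (just a sketch; the 100m value is my own guess and I haven't tested it yet):
"placeBoundingBox": {
  "type": "geo_shape",
  "tree": "quadtree",
  "precision": "100m"
}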
Thank you.