We're currently using Elasticsearch as our main datastore and for search.
We have an internal serialisation library that adds two metadata fields to every object: objectType and objectVersion.
We need to add geo_point functionality to our application, so we created a new datatype, geoPoint, which contains two fields: 'lat' and 'lon'. Since our custom library adds its metadata, a geoPoint will be serialised to:
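Something like the following (the lat/lon and objectVersion values are illustrative; the field names come from our library):

```json
{
  "objectType": "geoPoint",
  "objectVersion": 1,
  "lat": 41.12,
  "lon": -71.34
}
```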
When we try to index a document containing a field of type geo_point with the above structure, we get an error about the extra fields.
Our deserialisation library depends on this metadata and fails to deserialise objects if the fields are missing, which is why we cannot simply strip them in our application before indexing.
While searching for a solution, we found that a transform script in the mapping could solve this: it removes the extra fields from the indexed document, and since the transformed result is not stored in _source, deserialisation would still work.
Unfortunately, that feature was deprecated in 2.0 and removed in 5.0.
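For reference, what we had in mind is the 1.x-style mapping transform, roughly like this (index, type, and the `location` field name are placeholders for our actual names):

```json
PUT my_index
{
  "mappings": {
    "my_type": {
      "transform": {
        "script": "ctx._source.location.remove('objectType'); ctx._source.location.remove('objectVersion')"
      },
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

With this, the stored _source keeps the metadata our deserialiser needs, while the indexed geo_point sees only lat/lon.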
Could you recommend another way to achieve similar behaviour?
Perhaps there is a way to define an analyzer that strips the extra fields?