We have a series of documents with structures that are very flexible and entirely unpredictable. They can contain nested objects, and what's worse, a field (let's say "Title") can sometimes be a string and other times an object.
We'd very much like to be able to search these documents, but it's extremely hard to configure a mapping that doesn't run into one limitation or another.
First, with the default settings, indexing a document fails with a mapping conflict whenever the incoming type of a field like "Title" doesn't match the mapped type (object vs. string). Second, we can avoid all of this by setting "dynamic": "false", but then the unmapped fields aren't indexed or analysed at all, so they can't be searched.
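For reference, this is a minimal sketch of the second option (the index name and the empty properties are placeholders, not our actual mapping):

```json
PUT /docs
{
  "mappings": {
    "dynamic": false,
    "properties": {}
  }
}
```

With this, documents index without errors regardless of shape, but any field not listed under "properties" is only kept in _source and is invisible to queries.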
We'd prefer to be able to do a full-text search on the value of "Title" when it is a string, and ignore it otherwise, but that is clearly a challenge. And understandably so, because an index is structured around a fixed assumption about the type of each field (either declared in a mapping or index template, or inferred dynamically from the first indexed document).
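To illustrate the closest thing we've found (index name hypothetical): a dynamic template can at least ensure that dynamically-added string fields are mapped as analysed text, but it doesn't resolve the conflict when the same field name later arrives as an object:

```json
PUT /docs
{
  "mappings": {
    "dynamic_templates": [
      {
        "strings_as_text": {
          "match_mapping_type": "string",
          "mapping": { "type": "text" }
        }
      }
    ]
  }
}
```

Once "Title" has been mapped as text via this template, a subsequent document where "Title" is an object is still rejected.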
Lastly, there are hundreds of distinct document schemas, and we keep adding more. So trying to "figure out" a complete mapping, or to pre-process certain fields by hand, is simply not feasible.
I know this is a series of problems rather than a single topic, and I understand the limitations inherent in the design of a search engine, ES included. But I would like to know whether this is a recurring problem with at least partial solutions, or whether it's simply something we can't avoid.