Good evening. I've been using Elasticsearch for a while and I need some help with my architecture design. Basically, I'm using Elasticsearch to store dynamic forms, which can have any number (n) of fields. These forms are stored in "tables", or boxes, called Boards (our application is a Kanban), and I need to store the fields from these dynamic forms with their own type, not just a generic type like "Text".
For example, a field of the type "dueDate" NEEDS to be a date; I can't let it be text, otherwise I can't perform date operations on it with the Elasticsearch query language.
For now, I store all the forms as documents and the fields as key-value pairs on those documents, all logically separated by a boardId: when I need to search for a form, I search with my boardId and then retrieve it. My problem is that, for future features in our platform such as Dashboards and Data Analysis with AI, I NEED to index these dynamic fields; otherwise they are basically useless and I won't be able to provide a richer experience to the user. For example, the user won't be able to build dashboards from their own form, or search their forms based on the fields they created.
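To make this concrete, here is roughly what one of these documents and the boardId-filtered search look like today (a console-style sketch; the index name forms, the boardId value, and the f_... field keys are made-up examples, since the real keys are random field IDs):

// one form document; every field key is a generated field ID
PUT forms/_doc/1
{
  "boardId": "42",
  "f_8f3a91": "Fix login bug",
  "f_c27d04": "2023-09-30T12:00:00Z"
}

// retrieving a board's forms by boardId (assuming the default keyword subfield)
GET forms/_search
{
  "query": {
    "term": { "boardId.keyword": "42" }
  }
}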
Here are some questions readers of this post might have:
Q1: "Why don't you simply map it?"
A: Because it's dynamic, and I store the field's ID as its key, so I can't map it ahead of time: I will never know the field's name, since it's literally completely random.
Q2: "Have you tried the Dynamic Mapping?"
R: Yes, actually, I'm using it for a strategy that I've made, but I will discuss this later. Even using the dynamic Mapping, the amount of fields can literally be infinite, therefore, I can't measure the quantity of fields that I a form created by an user will have. "Why is this a problem?", because i'm storing everything in a single index, and a index has limits it's fields length. "Can't you simply increase the number of fields of the index?", I could do it, but who knows if in the future I will need to increase my fields limit to 100000 just because it's already full. I can't really measure in the moment the costs that this action would create (I've already seen in the official docs that this is an temporarily solution, not a "Go-to")
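For reference, this is the kind of temporary bump I'm talking about; index.mapping.total_fields.limit defaults to 1000, and the index name here is just a placeholder for my single forms index:

// raise the mapping field limit on the existing index (a stop-gap, per the docs)
PUT forms/_settings
{
  "index.mapping.total_fields.limit": 5000
}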
Q3: "So what's your actual question?"
A: My question is about the architecture I've built for this problem. Basically, I create an index for each Board the user creates. This increases the number of indices, but at least it lets me use dynamic mapping. The benefit is that I can map all of the user's dynamic fields; the only thing we have to do is make sure the values sent to Elasticsearch can be recognized by dynamic mapping as the right type. I'm even using a dynamic template to map date fields:
{
  "mappings": {
    "dynamic_templates": [
      {
        "dates_iso8601": {
          "match_mapping_type": "string",
          "match_pattern": "regex",
          "match": "^\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}Z$",
          "mapping": {
            "type": "date"
          }
        }
      }
    ]
  }
}
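To show how I apply this per Board, here is a rough sketch of an index template that attaches that same dynamic template to every board index; the board-* naming pattern (one index per boardId) is just my assumption for the example:

// any index created as board-<boardId> picks up the dynamic template automatically
PUT _index_template/boards
{
  "index_patterns": ["board-*"],
  "template": {
    "mappings": {
      "dynamic_templates": [
        {
          "dates_iso8601": {
            "match_mapping_type": "string",
            "match_pattern": "regex",
            "match": "^\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}Z$",
            "mapping": {
              "type": "date"
            }
          }
        }
      ]
    }
  }
}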
This helped me a lot and actually resolved my issue: I can map all of the fields and SEARCH my documents by them, since everything is separated into its own index.
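For example, once one of those fields is mapped as a date, a dashboard query can do real date math on it (the board-42 index and the f_c27d04 field key are made-up examples of a board index and a random field ID):

// forms due in the last 7 days, on one board's index
GET board-42/_search
{
  "query": {
    "range": {
      "f_c27d04": {
        "gte": "now-7d/d",
        "lte": "now"
      }
    }
  }
}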
Finally, the question: for my use case, is this architecture suitable? It may not scale, but I can't see any other alternative. If anyone reading this post with more knowledge and experience in Elasticsearch could help me, I would really appreciate it.
My Elasticsearch version: 8.6.0
Using the Elastic Cloud solution
Thanks for reading