Which Spring Data design to use when the total_fields limit is reached?

Hi,

We are using Elasticsearch through Spring Data, and we have reached the default "index.mapping.total_fields.limit".

We are indexing 5 different types of documents (distinguished by Java types, not index types). Each document has many specific attributes and some common ones (like "owner" or "user").

It would be easy to set a higher limit, but we plan to manage 50+ document types in the near future, and we are not comfortable raising the limit every time.

We tried splitting each document type into its own index, but the resulting service implementation smells, because we want to save a list of mixed documents, or find all the documents of a specific user via a multi-index search.

For example, we have 2 indexes (invoices and bills) and the corresponding Java types ("Invoice" and "Bill", both extending the same parent type "Document").
When we search for all the documents of a user, we want a response containing objects of the appropriate type, "Invoice" and/or "Bill".
When we save a list of "Document" ("Invoice" and/or "Bill"), each document must be stored in its own index.
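
To make it concrete, this is roughly how we picture the mapping. It is only a sketch, not our real code: the fields are invented for illustration, and the parent class is renamed to BaseDocument here only to avoid clashing with Spring Data's @Document annotation (each class would live in its own file in practice).

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;

// Parent type holding the common fields ("Document" in our model, renamed here
// to avoid clashing with Spring Data's @Document annotation).
public abstract class BaseDocument {

    @Id
    private String id;

    // Common field present in every document type, mapped the same way everywhere.
    @Field(type = FieldType.Keyword)
    private String owner;

    // getters and setters omitted
}

// Each concrete type declares its own index, so its specific fields only grow
// that index's mapping.
@Document(indexName = "invoices")
public class Invoice extends BaseDocument {

    @Field(type = FieldType.Double)
    private Double amount;
}

@Document(indexName = "bills")
public class Bill extends BaseDocument {

    @Field(type = FieldType.Text)
    private String supplier;
}
```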

Is there an easy way, with Spring Data or anything else, to save a list of documents so that each document goes to its own index, without writing a huge switch on Java types? (Our current attempt is the first sketch below.)
Is there an easy way, with Spring Data or anything else, to search several indexes at the same time and get each hit back as its appropriate Java type? (Our current attempt is the second sketch below.)
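
For the save question, the least bad approach we have found is to save the entities one by one, since, as far as we understand, save(entity) resolves the target index from the @Document annotation on the runtime class of each entity. This is only a sketch with invented names (DocumentStorageService, saveAll), assuming the mapping above and a recent Spring Data Elasticsearch (4.x or later); it also gives up bulk requests, and we are not sure whether the bulk save(Iterable) variant resolves the index per element or only from the first element.

```java
import java.util.List;
import java.util.stream.Collectors;

import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.stereotype.Service;

@Service
public class DocumentStorageService {

    private final ElasticsearchOperations operations;

    public DocumentStorageService(ElasticsearchOperations operations) {
        this.operations = operations;
    }

    // Saves a mixed list of BaseDocument subclasses (Invoice, Bill, ...).
    // save(entity) picks the index from the entity's runtime class, so no switch
    // is needed, but each entity becomes its own request instead of one bulk call.
    public List<BaseDocument> saveAll(List<BaseDocument> documents) {
        return documents.stream()
                .map(operations::save)
                .collect(Collectors.toList());
    }
}
```

For the search question, the only way we see right now is to run the same query once per type against its own index and merge the results. Again just a sketch with invented names (DocumentSearchService, findAllByOwner, searchAs), assuming the classes above:

```java
import java.util.ArrayList;
import java.util.List;

import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHit;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.data.elasticsearch.core.query.Criteria;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;
import org.springframework.stereotype.Service;

@Service
public class DocumentSearchService {

    private final ElasticsearchOperations operations;

    public DocumentSearchService(ElasticsearchOperations operations) {
        this.operations = operations;
    }

    // Finds every document owned by the given user, across all known types.
    // Each type is queried against its own index and mapped to its own class,
    // then the results are merged into one polymorphic list.
    public List<BaseDocument> findAllByOwner(String owner) {
        CriteriaQuery query = new CriteriaQuery(new Criteria("owner").is(owner));

        List<BaseDocument> result = new ArrayList<>();
        result.addAll(searchAs(query, Invoice.class));
        result.addAll(searchAs(query, Bill.class));
        return result;
    }

    private <T extends BaseDocument> List<T> searchAs(CriteriaQuery query, Class<T> type) {
        SearchHits<T> hits = operations.search(query, type);
        List<T> content = new ArrayList<>();
        for (SearchHit<T> hit : hits.getSearchHits()) {
            content.add(hit.getContent());
        }
        return content;
    }
}
```

This works, but the list of classes in findAllByOwner is really the switch on Java types in disguise, and it costs one request per document type instead of a single multi-index search. That is what we would like Spring Data to handle for us, if possible.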


I'd probably use different indices: one per type, with the same foundation fields for all the common fields, like the user as you said.

It would be easier to iterate on that with you if you could provide a more concrete example with actual fields, using the JSON representation...
