I need to store multiple datasets in Elasticsearch, each containing a large number of documents (somewhere between 100,000 and 1,000,000). They have essentially the same structure. Since I use/analyze only one dataset at a time, I think it would be a good choice to create an index for each of them. What do you think?
How many datasets do you expect to have? What is the average document size?
Probably something like 20 datasets.
Each dataset could be 1 GB to 15 GB.
Then an index per dataset sounds reasonable.
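For what it's worth, a minimal sketch of the index-per-dataset approach using the official Python client. The `dataset-` prefix and the dataset labels are illustrative assumptions, not anything Elasticsearch requires; the naming helper just keeps index names valid (Elasticsearch index names must be lowercase, with no spaces).

```python
# Hypothetical sketch: one Elasticsearch index per dataset, with a
# consistent naming convention so each dataset can be queried or
# deleted independently. The "dataset-" prefix is an assumption.

def dataset_index_name(dataset: str) -> str:
    """Map a dataset label to a valid Elasticsearch index name
    (index names must be lowercase and contain no spaces)."""
    return "dataset-" + dataset.strip().lower().replace(" ", "-")

# Illustrative dataset labels (assumptions, not from the thread).
datasets = ["Sales 2023", "Sales 2024", "Web Logs"]
index_names = [dataset_index_name(d) for d in datasets]
print(index_names)  # ['dataset-sales-2023', 'dataset-sales-2024', 'dataset-web-logs']

# With the official elasticsearch-py client, the indices would then
# be created against a running cluster roughly like this:
#
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch("http://localhost:9200")
#   for name in index_names:
#       es.indices.create(index=name)
#
# Since the datasets share a structure, the common mappings are best
# defined once in an index template matching "dataset-*".
```

Dropping a whole dataset then becomes a cheap `delete index` call instead of a costly delete-by-query.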
Excellent, thank you!