Queries about indexing data into Elasticsearch

I have some questions regarding indexing the data.

In my case, I need to push 100 million JSON records into a single index on a specific date, and that date has to be included as a field in each JSON document. I then import this data into Power BI using the _search REST API. Next I will push another set of data into the same index with a different date, bringing the total index size to 200 million records, and I will keep pushing JSON into the same index throughout the year. Is pushing historical data into the same index the correct approach? Will there be a memory problem in the future?
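For reference, here is a minimal sketch of that workflow, assuming the official 8.x Python `elasticsearch` client; the index name `records`, the field name `batch_date`, the host URL, and the sample documents are all hypothetical placeholders, not part of the original post.

```python
# Minimal sketch: bulk-index one dated batch into a single index,
# then query that batch back via _search (e.g. for Power BI).
from datetime import date

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

INDEX = "records"  # single index holding every batch, as described above


def generate_actions(records, batch_date):
    """Yield bulk actions, stamping each document with the batch date."""
    for record in records:
        doc = dict(record)
        doc["batch_date"] = batch_date.isoformat()  # the date field in each JSON document
        yield {"_index": INDEX, "_source": doc}


# Push one batch of documents for a given date.
records = [{"value": i} for i in range(1000)]  # stand-in for the real JSON records
bulk(es, generate_actions(records, date(2022, 1, 1)))

# Later, pull only that batch back by filtering on the date field.
resp = es.search(
    index=INDEX,
    query={"term": {"batch_date": "2022-01-01"}},
    size=100,
)
print(resp["hits"]["total"])
```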

It depends - how big do you expect the data set to be in the end?

Hi @warkolm

A single pushed JSON file is around 30 MB, so it looks like the index will grow by more than 2 GB over the course of the year. Is there a better way to handle this situation?

You will be able to store that in a single index with no dramas.
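If you want to keep an eye on how the index grows over the year, the cat indices API reports document count and store size. Another hedged sketch, again assuming the 8.x Python client and the hypothetical index name `records`:

```python
# Check document count and on-disk size of the index over time.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

for idx in es.cat.indices(index="records", format="json", bytes="mb"):
    print(idx["index"], idx["docs.count"], "docs,", idx["store.size"], "MB")
```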
