I am planning to ingest 2,500-3,000 events per second (EPS) with three months of retention. I estimate I will need roughly 60-80 TB of storage for that retention period. I intend to run 2 Elasticsearch nodes, maybe 3. Do you have any advice or recommended architectures I can apply to my case? Thanks
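For reference, here is the rough back-of-the-envelope math behind that estimate. The per-event on-disk size and replica count are my assumptions, not measured values; adjust both once you have indexed a sample of your own data.

```python
# Rough storage estimate for ~3000 EPS over 90 days.
# ASSUMPTIONS: ~1.5 KB on disk per indexed event and one replica copy.

EPS = 3000                    # events per second (upper end of the range)
RETENTION_DAYS = 90           # three-month retention
BYTES_PER_EVENT = 1500        # assumed on-disk size per indexed event
REPLICAS = 1                  # assumed one replica per primary shard

events = EPS * 86_400 * RETENTION_DAYS
primary_tb = events * BYTES_PER_EVENT / 1e12
total_tb = primary_tb * (1 + REPLICAS)

print(f"{events / 1e9:.1f} billion events")                              # ~23.3 billion
print(f"~{primary_tb:.0f} TB primary, ~{total_tb:.0f} TB with replicas")  # ~35 TB / ~70 TB
```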
The amount of data you can store on an Elasticsearch node often comes down to how efficiently you can optimise heap usage. There are quite a few different things that consume heap space, e.g. indexing, querying, field data and shard overhead, and you need to find a balance that works for your use case.
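If you want to see where you currently stand, the `_cat` APIs give a quick view of heap pressure and shard distribution per node. A minimal sketch, assuming the cluster is reachable at `http://localhost:9200` without authentication:

```python
# Quick check of per-node heap pressure and shard/disk allocation.
# ASSUMPTION: unauthenticated cluster at http://localhost:9200; adjust as needed.

import requests

BASE = "http://localhost:9200"

# Heap usage per node (heap.percent is the main number to watch).
print(requests.get(f"{BASE}/_cat/nodes",
                   params={"v": "true",
                           "h": "name,heap.percent,heap.max,ram.percent"}).text)

# Shards and disk usage per node (each shard carries fixed heap overhead).
print(requests.get(f"{BASE}/_cat/allocation", params={"v": "true"}).text)
```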
I would recommend watching this Elastic{ON} talk and reading this blog post on shards and sharding to get a better idea of how to size the system correctly. Having said that, I doubt you will be able to handle the data volumes you mention on a cluster that size, especially as the nodes will be indexing data as well as serving queries.
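To make the shard-count concern concrete, here is a rough sketch of the shard math. The daily-index layout, ~40 GB target shard size and single replica are assumptions for illustration; the linked blog post covers the actual sizing guidance.

```python
# Rough shard-count sketch for ~80 TB total (primaries + one replica) over 90 days.
# ASSUMPTIONS: daily indices, ~40 GB target primary shard size, one replica.

TOTAL_TB = 80
RETENTION_DAYS = 90
REPLICAS = 1
TARGET_SHARD_GB = 40          # assumed target size per primary shard

primary_gb_per_day = TOTAL_TB * 1000 / RETENTION_DAYS / (1 + REPLICAS)        # ~444 GB/day
primary_shards_per_day = max(1, round(primary_gb_per_day / TARGET_SHARD_GB))  # ~11
total_shards = primary_shards_per_day * (1 + REPLICAS) * RETENTION_DAYS       # ~1980

print(f"~{primary_gb_per_day:.0f} GB of primary data per day")
print(f"~{total_shards} shards open at any one time across the cluster")
```

Spread across 2-3 data nodes, that is on the order of 650-1,000 shards per node, and each of those shards adds heap overhead on top of what indexing and querying already consume, which is why a larger node count (or hot/warm tiering) is usually needed at this volume.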