I'm a newbie to Elastic.
Assume the total index size will be 500GB: say, 100 users with 5GB of index data each.
What hardware is needed for that? If querying/searching will only ever be performed on about 1/100 of the whole database at a time (the 5GB belonging to one user), can it run on smaller hardware, or will it need more resources anyway simply because of the total database size?
In other words, is there a strict rule that hardware resources must grow with index size, no matter how much of the data we actually query?
Welcome to our community!
You'd really want to test, but an 8GB node should be able to handle that.
Thanks, so even with a 2500GB index (more users than in the example) but still 5GB per client/user, would 8GB of RAM be enough? I'm trying to understand how it uses resources.
No, you will probably need more than that. Maybe 16-32GB.
There are no set resource usage figures we can quote here, as a lot of it depends on your data structures, query types, version, etc.
So it does not matter that the first search criterion (the user) limits each query to at most about 5GB of data, independent of the whole index size?
And do you mean the extra RAM is required for it to run at all, or just recommended so it runs efficiently/faster? I realize these can only be approximate estimates because query type matters, but I need a basic understanding of the engine.
Elasticsearch doesn't know that the data is logically segmented by user. It only sees indices, shards and documents.
So increasing the total data size by 5 times will mean increasing the heap size.
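As a rough back-of-envelope illustration of why heap tends to scale with total data size: if you assume some data-to-heap ratio (the `160` below is a hypothetical placeholder, not an official Elastic figure), the estimate grows linearly with the index, regardless of how little of it any single query touches.

```python
def estimate_heap_gb(total_index_gb, data_to_heap_ratio=160):
    """Back-of-envelope heap estimate.

    Assumes a hypothetical data-to-heap ratio (GB of index data
    served per GB of JVM heap). Real requirements depend heavily
    on mappings, query types, and Elasticsearch version, so treat
    this only as a sketch of the linear relationship.
    """
    return total_index_gb / data_to_heap_ratio

print(estimate_heap_gb(500))   # -> 3.125  (an 8GB node looks plausible)
print(estimate_heap_gb(2500))  # -> 15.625 (in line with the 16-32GB suggestion)
```

The point is only the shape of the relationship: growing the total data 5x grows the estimate 5x, even though each user's query still targets 5GB.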
OK, clear: so it might be required, not just recommended.
Where can I find calculated recommendations?
For example, I provide the storage size I need, and it gives me a recommended setup.
Queries can be a little slower; the main goal is to have big data storage.
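One common way to express the per-user segmentation discussed above is a filtered alias per user, so each search only matches that user's documents. This is a sketch, not something from this thread: the index name and the `user_id` field are assumptions; adapt them to your own mapping.

```python
def user_alias_action(index, user_id):
    """Build an _aliases action that adds a per-user filtered alias.

    POST the returned body to Elasticsearch's _aliases endpoint.
    `index` and the `user_id` field name are hypothetical examples.
    """
    return {
        "actions": [
            {
                "add": {
                    "index": index,
                    "alias": f"{index}-user-{user_id}",
                    "filter": {"term": {"user_id": user_id}},
                }
            }
        ]
    }

body = user_alias_action("documents", "42")
# Searching the alias "documents-user-42" now only matches that user's
# documents. Note this does not change the sizing answer above: the
# underlying shards still hold all users' data, and heap usage scales
# with the total index, not with what one alias exposes.
```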
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.