I'm a newbie to Elastic.
Assume the index size will be 500GB: say, 100 users with 5GB of index data each.
What hardware is needed for that? If querying/searching is always performed on only about 1/100 of the whole DB (the ~5GB belonging to one user), can it run on smaller hardware, or will it need more resources anyway simply because of the total database size?
In other words: is there a strict rule that hardware resources must grow with index size, regardless of how much data we actually process per query?
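For illustration, here is roughly the access pattern I have in mind, as a hypothetical sketch using the Python client (the index name, field, and URL are made up, not a real setup):

```python
from elasticsearch import Elasticsearch  # pip install elasticsearch

es = Elasticsearch("http://localhost:9200")

# Hypothetical layout: one index per user, ~5GB each, 100 indices total.
# Every search targets exactly one user's index, never the whole 500GB:
resp = es.search(
    index="user-42",                       # 1 index out of 100
    query={"match": {"message": "error"}}, # field/query are made up
)
print(resp["hits"]["total"])
```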
Thanks. So even with a 2500GB index (more users than in the example) but still 5GB per client/user, 8GB of RAM would be enough? I'm trying to understand how it uses resources.
So it doesn't matter that the first query criterion (the user) limits each query to at most ~5GB of data, independent of the whole index size?
You say it "might be required" to run at all, or to run efficiently/faster. Do you mean required (obligatory) or just recommended? I realize these can only be approximate estimates because query type matters, but I need a basic understanding of the engine.
Elasticsearch doesn't know that the data is logically segmented by user. It only sees indices, shards and documents.
So increasing the total data size by 5 times will mean increasing the heap size.
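To see why: heap is a node-level resource, so every open shard on a node contributes overhead whether or not queries ever touch it. A quick way to watch this with the Python client (a sketch, assuming a cluster at a hypothetical localhost URL):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # URL is an assumption

# Heap usage is per node: mappings, cluster state and segment metadata
# for every open shard are held even when that shard is never queried.
for node in es.cat.nodes(format="json", h="name,heap.percent,heap.max"):
    print(node)

# Per-shard view: 'store' (on-disk size) exists for all data, queried or idle.
for shard in es.cat.shards(format="json", h="index,shard,store"):
    print(shard)
```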
OK, clear. So it might be required, not just recommended.
Where can I find calculated recommendations?
For example: I provide the storage size I need, and it gives me a recommended setup.
Queries can be a little slower; the main goal is to have big data storage.