I am running a low-budget website that sits on a Windows VM with 2 GB of RAM.
The current memory footprint of the OS and website combined is around 1.3 GB.
The reason I want to use Elasticsearch is to get faster access to products, which currently live in SQL Server. There are under 1,000 products, so the document count is low, but I might want to do some aggregations for filtering etc.
Will Elasticsearch work well with around 500 MB of RAM available? Will it even run? Is there a different product that might be a better fit as a low-volume, low-memory document DB?
Elasticsearch runs on an allocated JVM heap. You can change the heap size by editing the `-Xms` and `-Xmx` parameters in the `jvm.options` file, located in the `config` folder of your Elasticsearch installation.
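For reference, a minimal `jvm.options` heap setting might look like this (the 256 MB value is purely illustrative; pick a size that fits your VM, and keep min and max equal as the Elasticsearch docs recommend):

```
# config/jvm.options
# Set initial and maximum heap to the same value to avoid resize pauses
-Xms256m
-Xmx256m
```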
I would think that if you're having performance problems with an RDBMS on such a tiny amount of data, you're doing something wrong, and my first approach would be to find and fix whatever that "something" is rather than move to a whole new technology stack.
The SQL schema is not mine. It's created by the eCommerce platform we are using, which does NHibernate magic.
I could of course store my own model in a separate table in SQL, but the SQL Azure instance is not very quick, even for simple queries. That is why I want to bring a fast datastore closer to my application for entities that I use a lot.
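With under 1,000 products, one cheap alternative to running a separate search engine is to cache the product set in application memory after the first database hit. A minimal sketch in Python for illustration (the `fetch_product_from_db` helper is hypothetical, standing in for whatever query your platform issues against SQL Azure):

```python
from functools import lru_cache

def fetch_product_from_db(product_id):
    # Hypothetical placeholder for the real, slow round trip to SQL Azure.
    return {"id": product_id, "name": f"Product {product_id}"}

@lru_cache(maxsize=1024)  # comfortably holds under 1,000 products
def get_product(product_id):
    # First call per id hits the database; repeats are served from memory.
    return fetch_product_from_db(product_id)
```

The same pattern is available in most web stacks (e.g. an in-memory cache in the application process), and it keeps hot entities close to the app without the operational cost of a second datastore.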