Estimating minimum memory requirement for my dataset


I have an online store with ~3000 products, each with 4-8 specs (e.g. "weight", "size"). I want to use Elasticsearch to provide faceted navigation to my users (e.g. combining filters to narrow down the results). So I'll have a single index containing the ID and specs of each product.
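For a faceted-navigation use case like this, the specs can be mapped as `keyword` fields, which support exact-match filters and terms aggregations without the analysis overhead of full-text fields. A minimal sketch (the index name and field names here are hypothetical, just to illustrate the shape):

```json
PUT /products
{
  "mappings": {
    "properties": {
      "product_id": { "type": "keyword" },
      "weight":     { "type": "keyword" },
      "size":       { "type": "keyword" }
    }
  }
}
```

Each facet is then a `terms` aggregation on one of these fields, combined with `bool`/`filter` clauses for the filters the user has already selected.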

I don't think I'll need full-text search yet.

Can you recommend a minimum amount of memory for such a use case?

Thanks in advance!

As a rule of thumb, I wouldn't recommend less than 4GB of RAM per node in production.
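On a node sized like that, the usual guidance is to give the JVM heap no more than half of the machine's RAM, leaving the rest to the OS filesystem cache that Lucene relies on. A sketch for a hypothetical 4GB node (set in `config/jvm.options` or via `ES_JAVA_OPTS`):

```
# Give Elasticsearch half the node's RAM as heap;
# min and max should be equal to avoid resize pauses.
-Xms2g
-Xmx2g
```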