I've just implemented autocomplete suggestions using Elasticsearch, and it works like a charm.
The index size is only about 100 KB, containing roughly 500 filter combinations which clients then use to make further requests to the web server.
Considering that Elasticsearch uses about 1.2 GB of memory, I feel like I'm using a sledgehammer to crack a nut here.
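To illustrate the scale: at ~500 entries, even a naive in-memory prefix lookup would be instant. A minimal sketch (the `filters` list here is hypothetical stand-in data, not my real filter combinations):

```python
import bisect

# Hypothetical stand-in for the ~500 filter combinations.
filters = sorted(f"filter-{i:03d}" for i in range(500))

def suggest(prefix, limit=10):
    """Return up to `limit` terms starting with `prefix`,
    using binary search on the sorted list."""
    start = bisect.bisect_left(filters, prefix)
    out = []
    for term in filters[start:start + limit]:
        if not term.startswith(prefix):
            break
        out.append(term)
    return out
```

So functionally this would be trivial to replace; my question is more about whether running a full Elasticsearch node for it is reasonable.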
Is Elasticsearch suitable for small datasets like this, or are there more lightweight alternatives for such a use case?
I really couldn't find much on this topic; in other threads, people consider datasets of several GB to be small.