I downloaded Elasticsearch to try it, and it consumed 33GB of memory from the start, without anything in it yet. I believe this is because it is written in Java, and that's what makes it so hungry for resources. Has Elastic thought of porting their products to another language that's not as hungry, like Python or PHP? We decided not to use it because of this.
I really hope somebody is listening and thinks about it.
Elasticsearch used to be configured with a 1GB heap by default. In recent versions the default heap size is instead calculated based on available resources and node roles. This assumes Elasticsearch is running on a dedicated host with access to all of its resources, and it roughly follows the best practice of setting the heap to 50% of available RAM. The heap is allocated on startup.
If you follow the link I provided you will find instructions on how to set the heap size to a smaller value. I would not recommend going below 1GB though.
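As a sketch of what that looks like: on a recent Elasticsearch install you can override the automatic heap sizing by adding a custom JVM options file under `config/jvm.options.d/` (the filename `heap.options` here is arbitrary; the path assumes a default archive install):

```
# config/jvm.options.d/heap.options
# Pin minimum and maximum heap to the same value, per Elastic's recommendation
-Xms1g
-Xmx1g
```

Setting `-Xms` and `-Xmx` to the same value avoids heap resizing pauses at runtime; restart the node for the change to take effect.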