Why is Elasticsearch so hungry for resources?

Hey guys,

I downloaded Elasticsearch to try it, and it used 33GB of memory from the start, without anything in it yet. I believe this is because it is written in Java, and that's what makes it so hungry for resources. Has Elastic thought about porting their products to another language that isn't that hungry, like Python or PHP? We decided not to use it because of this.

I really hope somebody is listening and thinks about it.

Elasticsearch used to be configured with a 1GB heap by default. In recent versions the default heap size is instead calculated from the available resources and the node's roles. This calculation assumes Elasticsearch is running on a dedicated host with access to all of its resources, and it roughly follows the best practice of setting the heap to 50% of available RAM. Note that the heap is allocated on startup.

If you follow the link I provided you will find instructions on how to set a smaller heap size. I would not recommend going below 1GB, though.
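For reference, the heap is configured through JVM options rather than `elasticsearch.yml`. A minimal sketch of the two standard mechanisms for capping it at 1GB (the paths assume a default archive installation; the filename `heap.options` is just an example):

```shell
# Option 1: drop a custom options file into config/jvm.options.d/.
# -Xms (initial) and -Xmx (maximum) should be set to the same value.
mkdir -p elasticsearch/config/jvm.options.d
cat > elasticsearch/config/jvm.options.d/heap.options <<'EOF'
-Xms1g
-Xmx1g
EOF

# Option 2: the ES_JAVA_OPTS environment variable, which overrides
# settings from the jvm.options files for that one launch.
ES_JAVA_OPTS="-Xms1g -Xmx1g" ./elasticsearch/bin/elasticsearch
```

Option 1 is the persistent approach; option 2 is handy for a quick local test.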

Welcome to our community! :smiley: We aren't all guys though.

It'd be handy if you elaborated a little more on what exactly you did.

It's a fair bit more complicated than that unfortunately, and unlikely to happen.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.