Initially the application started out running on 1 GB of RAM; the next day we upgraded to 2 GB, then 4 GB, and it still continues to consume more and more RAM over time.
All it is indexing is about 32.4 MB worth of data in total; I have shared a screenshot of the indices from the server.
The doc.count increases by 2 every minute, which is being indexed from MongoDB.
We even tried reducing the heap size, but nothing changed.
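For context, a minimal sketch of how the heap size is typically adjusted, assuming the default `config/jvm.options` file is used (the values below are illustrative, not the poster's actual settings):

```
# config/jvm.options -- illustrative values only
# Set initial and maximum heap to the same size, as the
# Elasticsearch docs recommend (here: 1 GB).
-Xms1g
-Xmx1g
```

The same can be done per-process via the `ES_JAVA_OPTS` environment variable, e.g. `ES_JAVA_OPTS="-Xms1g -Xmx1g"`.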
Below are screenshots we took every 5 to 10 minutes; you can see the decrease in free memory.
Please suggest a solution to stop Elasticsearch from consuming more and more memory over time.
Please don't post pictures of text or code. They are difficult to read, impossible to search and replicate (if it's code), and some people may not even be able to see them.
Are you talking about heap, or system memory?
Then that's outside the scope of what Elasticsearch manages, and you will need to consult your OS documentation for how to limit that (if that is what you want to do).
But it's not a bad thing: it's the OS caching commonly accessed files related to the Elasticsearch process. This is why we recommend leaving half of the system memory to the OS, so this caching can happen and improve the speed of Elasticsearch.
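To make the heap-versus-page-cache distinction concrete, here is a hedged sketch of how to compare the two, assuming a node running locally on the default port 9200 and a Linux host:

```shell
# Heap and RAM usage as reported by Elasticsearch itself:
# heap.current/heap.max is the JVM heap, ram.current/ram.max is
# system memory (including the OS file cache).
curl -s 'localhost:9200/_cat/nodes?v&h=name,heap.current,heap.max,ram.current,ram.max'

# The system view on Linux: "free" often looks low while "available"
# stays high, because the buff/cache column is page cache the OS can
# reclaim at any time.
free -h
```

If `heap.current` is stable while `ram.current` keeps climbing, what you are seeing is the OS file cache, not a leak in the Elasticsearch process.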
We don't see any performance issues with Elasticsearch, but it takes almost 2 GB of memory to handle 140 MB worth of indices. Is this normal behavior, or is there any setting (other than the heap size) we can tweak to optimize memory usage? Please advise.
Again, this is the OS caching commonly accessed files. It's totally normal.
If you want to restrict it you'd need to consult your OS documentation.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.