I have been using Elasticsearch for a few months in a project. The simplicity and
power it has are really amazing.
On this project we are using a VPS to host a normal LAMP stack +
Elasticsearch.
The problem is the memory usage.
The server's memory usage (cumulative values):
Nothing running: 28 MB
With MySQL: 159 MB
With Apache: 216 MB
With ES: 650-700 MB
The thing is, I'm only indexing around 10k documents. My config file
(Elasticsearch is running as a service) is:
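The actual file contents did not come through above. For reference, here is a minimal
sketch of the kind of heap settings a service setup of that era would pick up from
bin/elasticsearch.in.sh, with purely illustrative values rather than the poster's real
config:

    # Heap limits read by bin/elasticsearch.in.sh and passed to the JVM
    # as -Xms / -Xmx (illustrative values, not the actual file)
    ES_MIN_MEM=64m
    ES_MAX_MEM=64m
    export ES_MIN_MEM ES_MAX_MEM

With these set, the JVM is started with matching -Xms/-Xmx flags, so the heap itself is
capped at that value.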
I've checked that the server is started with the correct memory
configuration, but still, after anywhere from a couple of hours to 2-3 days, the memory
usage goes over the top and the server crashes.
So I have to constantly restart the Elasticsearch service.
I'm running CentOS on a VPS with a 1 GB memory limit. If you would like to know
anything more, please ask.
Even though this is nothing VERY serious right now, I'm worried that with a
traffic spike the server would crash/restart very quickly. Can you help me?
Getting back to this: even though my max heap memory is 64 MB, today it reached
more than 300 MB of resident memory. Is that normal?
Since I set it to a max of 64 MB, I would expect it not to cross that
point.
When you set the max heap memory to 64 MB, that's all it controls: the
heap memory the Java process will have. It does not count memory
associated with things like sockets, file handles, and so on.
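A quick way to see the two numbers side by side is to compare the resident memory of
the process with the heap the JVM itself reports. A minimal sketch using standard
Linux/JDK tools, assuming a single Elasticsearch process on the box and that jstat is
run as the same user as the JVM:

    # Find the Elasticsearch JVM (assumes a single instance on the machine)
    ES_PID=$(pgrep -f elasticsearch | head -n 1)

    # Resident set size in KB: heap plus thread stacks, sockets, buffers and JVM overhead
    ps -o rss= -p "$ES_PID"

    # Heap usage and capacity as the JVM sees it; the capacity stays within -Xmx
    jstat -gc "$ES_PID"

The gap between the two numbers is the non-heap memory described above: thread stacks,
direct buffers, sockets, file handles, and the JVM itself.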