Hi, I have a dedicated server with 128 GB of RAM, acting as a single-node cluster.
Dual Intel® Xeon® E5-2620 V1 & 2 × 4TB HDD
The server is being used for Nginx, PHP, MySQL, and Elasticsearch.
Java heap memory is set to 15 GB.
There is only one main index.
Kibana shows:
Document count: 4.6 million
Data: 1.9 GB
Current search rate: 339.34/s
I would like to know when I should add more nodes; right now the setup is a single-node cluster.
How many queries per second can my current setup handle?
Typically, each node in an Elastic Stack should be dedicated and run no additional applications. You'll get better performance if you split the resources into multiple VMs. Since you have 128 GB of RAM, I would use something like this (there's a quick sketch after the list for checking the resulting cluster topology):
Elasticsearch (8 GB RAM)
Elasticsearch (8 GB RAM)
Elasticsearch (8 GB RAM)
Logstash (8 GB RAM)
Kibana / Nginx (4 GB RAM)
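Once you have more than one Elasticsearch node, the `_cat/nodes` API is a quick way to confirm that they have all joined the cluster and how much heap each one actually has. A minimal sketch in Python, assuming the cluster is reachable on `localhost:9200` without authentication:

```python
import requests

# _cat/nodes lists every node in the cluster; the h= parameter selects columns.
# heap.max is the configured heap, heap.percent / ram.percent are current usage.
resp = requests.get(
    "http://localhost:9200/_cat/nodes",
    params={"v": "true", "h": "name,node.role,heap.max,heap.percent,ram.percent"},
    timeout=10,
)
resp.raise_for_status()
print(resp.text)
```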
There's a limit on how much RAM you should allocate to the JVM heap (roughly 32 GB) because of how the JVM handles ordinary object pointers (OOPs). Anything below about 32 GB on a 64-bit system lets Java use compressed OOPs. If you cross that threshold, the JVM falls back to uncompressed 64-bit OOPs, which are significantly less efficient: you waste memory on pointer overhead and reduce CPU cache effectiveness, among other things.
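You can check whether a node is actually running with compressed OOPs, since the nodes info API reports it per node. A small sketch, again assuming Elasticsearch on `localhost:9200`:

```python
import requests

# GET _nodes/jvm returns per-node JVM details, including the configured heap
# and whether compressed ordinary object pointers are in use.
resp = requests.get("http://localhost:9200/_nodes/jvm", timeout=10)
resp.raise_for_status()

for node in resp.json()["nodes"].values():
    jvm = node["jvm"]
    heap_max_gb = jvm["mem"]["heap_max_in_bytes"] / 1024 ** 3
    compressed = jvm.get("using_compressed_ordinary_object_pointers", "unknown")
    print(f"{node['name']}: heap_max={heap_max_gb:.1f} GB, compressed_oops={compressed}")
```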
To test and scale your Elastic Stack, I would closely monitor CPU, RAM, and JVM heap usage. If the JVM is consistently using more than 75% of its allocated heap, you should consider adding resources or nodes.
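Heap usage is exposed through the node stats API as `heap_used_percent`. A quick sketch that flags any node over the 75% threshold (same `localhost:9200` assumption as above); note this is a single snapshot, so you'd want to poll it over time rather than act on one reading:

```python
import requests

HEAP_THRESHOLD = 75  # percent; consider scaling if usage stays above this

# GET _nodes/stats/jvm returns live JVM stats, including heap_used_percent per node.
resp = requests.get("http://localhost:9200/_nodes/stats/jvm", timeout=10)
resp.raise_for_status()

for node in resp.json()["nodes"].values():
    used = node["jvm"]["mem"]["heap_used_percent"]
    status = "WARN" if used > HEAP_THRESHOLD else "ok"
    print(f"[{status}] {node['name']}: heap used {used}%")
```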