We are analyzing ES for storing our log data (~400 GB/day) and will be
integrating Logstash and ES. What is the maximum amount of data that can
be stored on one node of ES?
Depends.
How much disk do you have? RAM? CPU? Java version and release? ES version?
What's your query load like? Are you doing lots of aggregates or facets?
The best way to know is to start using ELK on a platform indicative of
your intended server size, see how much data a single node can handle,
and then extrapolate.
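If it helps, below is a minimal sketch (Python, using the requests
library) of the kind of check you could run while loading representative
data: it polls the _nodes/stats API and reports how much each node has
indexed versus how much disk remains. The localhost:9200 endpoint and the
exact stats field paths are assumptions based on the 1.x API layout, so
adjust them for your setup.

import requests  # assumed available; any HTTP client works

# Poll node stats during a load test and compare indexed bytes against
# remaining disk. Field paths follow the 1.x _nodes/stats layout --
# verify them against your ES version.
def node_capacity_snapshot(host="http://localhost:9200"):
    stats = requests.get(host + "/_nodes/stats").json()
    for node_id, node in stats["nodes"].items():
        indexed_gb = node["indices"]["store"]["size_in_bytes"] / 1e9
        free_gb = node["fs"]["total"]["available_in_bytes"] / 1e9
        total_gb = node["fs"]["total"]["total_in_bytes"] / 1e9
        print("%s: %.1f GB indexed, %.1f of %.1f GB disk free"
              % (node.get("name", node_id), indexed_gb, free_gb, total_gb))

node_capacity_snapshot()

Run something like this periodically while Logstash feeds in test data
and note where indexing or query latency starts to degrade; that point,
not a fixed number, is the practical per-node limit.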
We are planning to use >1 TB SSDs for our nodes; a typical node machine
has a 24-core CPU and ~200 GB RAM. Our ELK stack: Elasticsearch v1.3.2,
Logstash 1.4.1 on Java 1.7, and Kibana 3.0.
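As a rough illustration of what those numbers mean (not a sizing
recommendation), here is a back-of-the-envelope sketch; the on-disk
overhead and usable-disk factors are assumptions to replace with
measurements from your own test, and replicas multiply the total:

# Approximate retention per node. The 1.2x on-disk overhead and the 85%
# usable-disk fraction are assumed values, not measured ones.
def retention_days(disk_gb, daily_ingest_gb,
                   index_overhead=1.2, usable_fraction=0.85):
    usable_gb = disk_gb * usable_fraction
    per_day_gb = daily_ingest_gb * index_overhead
    return usable_gb / per_day_gb

# ~1 TB SSD, 400 GB/day of raw logs -> roughly 1.8 days of retention per
# node before replicas are counted.
print(round(retention_days(1000, 400), 1))

In other words, at 400 GB/day a ~1 TB node holds well under a week of
logs even before replicas, so retention and replica count will size the
cluster long before any per-node maximum does.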
Regards,
Gaurav