I have been using Elasticsearch, Kibana, and Logstash (**elasticsearch-5.0.2**) for the last month for log monitoring. I can view the last month's log history in reports, and right-clicking the Elasticsearch data folder shows it is around 5 GB.
My question is: up to how many GB of data can Elasticsearch support? (My OS is Windows, and the C: drive capacity is around 250 GB.)
There is no one absolute rule about how much it can handle. It comes down to your hardware: number of systems, speed of the drives, number of indexes/shards, and how fast you want it to run.
I see little reason it couldn't handle your entire drive, but your queries may be slow (especially if you're using a 5-7k RPM drive). If you're doing only simple searches, I'd imagine you'll be fine as well. But if you're going to run aggregations, which need lots of memory, you may have problems.
Sorry, there is no absolute answer; some of it depends on how you're going to use it and what you're willing to accept.
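As a rough back-of-the-envelope check, you can estimate how many months of logs fit on the drive. This is a minimal sketch, assuming index growth stays roughly linear at the ~5 GB/month you observed and that you keep some headroom free for the OS and other files; the headroom fraction here is an arbitrary illustrative choice, not an Elasticsearch requirement.

```python
def months_of_retention(drive_gb, monthly_gb, headroom_fraction=0.25):
    """Estimate whole months of logs that fit on a drive.

    Assumes roughly linear index growth (monthly_gb per month) and
    reserves headroom_fraction of the drive for the OS and other data.
    """
    usable_gb = drive_gb * (1 - headroom_fraction)
    return int(usable_gb // monthly_gb)

# ~5 GB/month observed, 250 GB drive, keep 25% free:
print(months_of_retention(250, 5))  # -> 37
```

In practice you would also watch actual index sizes over time (they rarely grow perfectly linearly) and delete or archive old indexes before the disk fills up, since Elasticsearch stops allocating shards when disk watermarks are exceeded.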