How to control the data ingested into Elasticsearch

Hi,

  • I have decided to use ELK in a distributed environment.
  • I have a central machine where Elasticsearch is running as a single node (no cluster).
  • All the other machines (up to 100) send their log contents to this central machine using Logstash; see the sketch after this list.
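As a rough way to confirm this setup works end to end, here is a minimal sketch (Python with the requests library; "central-es-host:9200" is a hypothetical endpoint with no security enabled) that checks the central node is reachable and lists the indices the shippers have created:

```python
# Minimal connectivity check, run from one of the shipper machines.
# "central-es-host" is a hypothetical hostname; no authentication assumed.
import requests

ES_URL = "http://central-es-host:9200"

# Basic reachability and version check of the central Elasticsearch node.
info = requests.get(ES_URL).json()
print("Cluster:", info["cluster_name"], "Version:", info["version"]["number"])

# List the indices the Logstash shippers have created, with doc counts and sizes.
print(requests.get(ES_URL + "/_cat/indices?v&h=index,docs.count,store.size").text)
```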

My questions are:
1. How much data can Elasticsearch accept, index, and store?
2. Does it depend on the hard disk size? (See the sketch below.)
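For question 2: Elasticsearch has no fixed limit on how much data it can accept; on a single node the practical bound is mostly disk space. Elasticsearch tracks per-node disk usage and uses configurable watermark settings to stop allocating new shards as the disk fills up (newer versions also block writes at a flood-stage watermark). A minimal sketch, reusing the hypothetical central-es-host:9200 endpoint, of checking disk usage and those watermarks:

```python
# Sketch of checking per-node disk usage and the disk watermark settings.
# Same hypothetical "central-es-host:9200" endpoint as above, no security.
import requests

ES_URL = "http://central-es-host:9200"

# Disk usage per node, as Elasticsearch itself sees it.
print(requests.get(ES_URL + "/_cat/allocation?v").text)

# Defaults are roughly: stop allocating new shards at 85% disk usage (low)
# and start relocating shards away at 90% (high). Adjust only with good reason.
settings = {
    "transient": {
        "cluster.routing.allocation.disk.watermark.low": "85%",
        "cluster.routing.allocation.disk.watermark.high": "90%",
    }
}
print(requests.put(ES_URL + "/_cluster/settings", json=settings).json())
```

On a single node there is nowhere to relocate shards to, so in practice free disk space and the low watermark are the limits to watch.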

Please give me some input if I am missing anything in my understanding here.

It depends on the hard disk size, and on a bunch of other things as well: heap size, document and query types, version, etc.
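A short sketch, again against the hypothetical central-es-host:9200 endpoint, of inspecting two of those factors (heap usage and version) through the _cat APIs:

```python
# Sketch of inspecting heap usage and the Elasticsearch version on the node.
# Same hypothetical "central-es-host:9200" endpoint assumed.
import requests

ES_URL = "http://central-es-host:9200"

# One row per node: name, Elasticsearch version, and JVM heap usage.
columns = "name,version,heap.current,heap.max,heap.percent"
print(requests.get(ES_URL + "/_cat/nodes?v&h=" + columns).text)
```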
