Hi,
I'm using the ELK stack.
I'm trying to collect about 120 million documents a day.
I'm trying to work out how many physical servers I need for Logstash and for Elasticsearch.
I want to store those 120 million documents a day without losing any data.
Can I get some guidance on the number of physical servers?
That depends on a few things:

- What type of data is it?
- What is the average document size?
- How long do you need to keep the data?
- Do you require the cluster to be highly available?
- How will you use/query this data?
- How many concurrent queries do you need to support?
- What about query latency requirements?
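Once you have answers to those questions, you can do a rough back-of-envelope estimate. Here is a minimal sketch in Python; every figure in it (average document size, retention, replica count, per-node disk and indexing rate) is an assumption for illustration, not a recommendation, so plug in your own numbers:

```python
# Back-of-envelope Elasticsearch sizing for ~120M docs/day.
# All figures below (doc size, retention, replicas, per-node capacity)
# are illustrative assumptions -- replace them with your own measurements.

docs_per_day = 120_000_000
avg_doc_size_bytes = 1_000        # assumed ~1 KB per indexed document
retention_days = 30               # assumed retention period
replicas = 1                      # 1 replica for high availability
disk_per_node_tb = 2.0            # assumed usable disk per data node
indexing_rate_per_node = 10_000   # assumed sustained docs/sec per node

# Average indexing throughput (ignores peaks; size for peak load in practice).
docs_per_second = docs_per_day / 86_400

# Raw data volume per day and total on-disk footprint including replicas.
# On-disk index size can be larger or smaller than raw size depending on
# mappings and compression; treat 1:1 as a rough starting point only.
daily_index_gb = docs_per_day * avg_doc_size_bytes / 1024**3
total_storage_tb = daily_index_gb * retention_days * (1 + replicas) / 1024

nodes_for_storage = total_storage_tb / disk_per_node_tb
nodes_for_indexing = docs_per_second / indexing_rate_per_node

print(f"Average indexing rate: {docs_per_second:,.0f} docs/s")
print(f"Daily index size:      {daily_index_gb:,.0f} GB")
print(f"Total storage ({retention_days}d retention, with replicas): {total_storage_tb:,.1f} TB")
print(f"Data nodes needed for storage:  {nodes_for_storage:,.1f}")
print(f"Data nodes needed for indexing: {nodes_for_indexing:,.1f}")
```

In practice you would also benchmark a single node with your real documents and mappings, since field analysis and compression heavily affect on-disk size and sustainable indexing throughput, and then take the larger of the storage-driven and indexing-driven node counts.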