Thank you for your reply.
We need more info to answer this.
How important is reliability? How bad would it be if you lose the data? How often do you need to access the data? How many people will search through it? How often do you need to move the whole indices to other machines?
Right now we are planning to enable it for only a few applications, so we will have at most 4-5 users, accessing the application logs 4-5 times a day. We will generate graphs and reports 3-4 times a day.
Actually, there is no business impact even if it goes down, but we need this data for production support, in case we receive any complaints from clients that need investigation.
Overall your requirements are very modest; compared to other ELK users this is a walk in the park. 30 GB of data over 30 days is nothing.
No, right now it is only for a small application, but our main aim is to enable ELK for all applications. We have more than 90 GB of data per day, and we need to keep that data for 30 days.
If data safety is important, there is no way around a 3-node cluster. Almost any entry-level server should be good enough: give each node 8 GB of RAM, a 500 GB SSD or HDD (depending on your speed needs), and a normal modern processor, and you are good to go.
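As a rough sketch, a minimal elasticsearch.yml for such a 3-node cluster could look like this (node names and addresses are placeholders, and I am assuming Elasticsearch 7.x here):

```yaml
# elasticsearch.yml -- repeat on each of the three nodes,
# changing node.name and network.host per node
cluster.name: logs-cluster                 # placeholder cluster name
node.name: es-node-1                       # es-node-2 / es-node-3 on the others
network.host: 10.0.0.1                     # placeholder address of this node
discovery.seed_hosts: ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
cluster.initial_master_nodes: ["es-node-1", "es-node-2", "es-node-3"]
```

Also give each node about half its RAM as JVM heap (so -Xms4g/-Xmx4g in jvm.options on an 8 GB machine) and leave the rest to the filesystem cache.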
If data safety is not too important, just use a single-node cluster and snapshot the data to another server.
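For the snapshots, something like the sketch below would do (repository name and mount point are placeholders; the `fs` repository type requires the path to be listed under `path.repo` in elasticsearch.yml and mounted from the backup server, e.g. via NFS):

```
PUT _snapshot/nightly_backup
{
  "type": "fs",
  "settings": {
    "location": "/mnt/es-backups"
  }
}

PUT _snapshot/nightly_backup/snapshot-1?wait_for_completion=true
```

Run the second call on a schedule (cron, or snapshot lifecycle management if your version has it) so you always have a recent copy off the box.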
As I said, your requirements are small and you do not have a lot of data, so not much hardware is required.
You can easily run Kibana on the same server as Elasticsearch, since you have little traffic anyway.
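Running Kibana on the same box is then just a matter of pointing it at the local node, roughly like this (assuming 7.x; on 6.x the setting is elasticsearch.url):

```yaml
# kibana.yml on the same host
server.host: "0.0.0.0"                          # listen on all interfaces
elasticsearch.hosts: ["http://localhost:9200"]  # local Elasticsearch node
```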
OK. Can you suggest a server configuration if I have 10 GB of data per day and need to keep 30 days on the server? The data is important, and 10 users will use it for production investigations.
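My rough math for this, assuming one replica per index:

```
10 GB/day x 30 days        = 300 GB of raw data
x 2 (primary + 1 replica)  = 600 GB on disk across the cluster
```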
Based on this calculation and its performance, we will size and set up the cluster for my main applications (90 GB of data per day).
Please suggest configurations and the number of servers required.