We have a requirement to analyse and visualize 10 TB of data using ELK, and we currently have 3 physical servers (RHEL-based) for this task. Please suggest the best ways to install Elasticsearch/Kibana/Logstash across these servers, along with best practices.
For Example:
Install Elasticsearch, Kibana, Logstash on 1 node each.
Install Elasticsearch, Kibana on 2 nodes, Logstash on 1 node.
....etc
There is no limit on CPU/memory/storage capacity; assume the hardware can meet the requirements.
Put an Elasticsearch instance on each server to form a three-node cluster; spreading your data across all three servers makes for faster querying. Make all the instances master-eligible, with at least two required to be available to form a cluster, so you avoid split-brain. There's no single correct answer for how many Kibana and Logstash instances to run, because it depends on your workload. Start with one of each unless you see a reason to have more.
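As a sketch, the per-node `elasticsearch.yml` for such a three-node cluster might look like the following (the hostnames `es-node-1`..`es-node-3` and the cluster name are placeholders, not values from your environment). On Elasticsearch 7+, quorum-based master election is handled automatically once the cluster is bootstrapped via `cluster.initial_master_nodes`; on 6.x you would instead set `discovery.zen.minimum_master_nodes: 2` to get the "at least two masters" split-brain protection described above:

```yaml
# elasticsearch.yml — same on all three servers, changing only node.name
cluster.name: elk-analytics          # placeholder cluster name
node.name: es-node-1                 # es-node-2 / es-node-3 on the others
node.master: true                    # all three nodes are master-eligible
node.data: true                      # all three nodes hold data
network.host: 0.0.0.0
discovery.seed_hosts: ["es-node-1", "es-node-2", "es-node-3"]
# Only used the first time the cluster forms (Elasticsearch 7+):
cluster.initial_master_nodes: ["es-node-1", "es-node-2", "es-node-3"]
```

With all three nodes both master-eligible and data-holding, the cluster tolerates the loss of any one server while still having the two masters needed to elect a leader.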