We are using Elasticsearch 1.7.3 for our application. Our ES cluster runs on 3 physical servers, each with 48 CPU cores and 256 GB of RAM. We have 3 master nodes, 6 data nodes, and 1 client node, holding nearly 6 TB of data. The cluster has three indices: es_chr, es_cval, and es_item. es_chr has 1 primary shard, es_cval has 6, and es_item has 18. All indices are configured with 2 replicas.
We use Marvel for monitoring the cluster and Shield for authentication. Initially we bulk-indexed all the data into ES; after that, whenever the data changes, we push the updates to ES through a sync process.
Every day, Elasticsearch generates its own log files on each node. How can we collect these Elasticsearch logs with Logstash and visualize them in Kibana?
Note: I am new to Logstash, so please also suggest best practices for making this work.
Please guide us.
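For reference, this is the kind of Logstash pipeline we were considering — a rough sketch, not tested. The file path, grok pattern, and index name are our assumptions (ES 1.7 writes log4j-style lines like `[2015-10-27 10:15:30,123][INFO ][cluster.service] [node-1] ...`), so the pattern may need adjusting for our actual log layout:

```
input {
  file {
    # Assumed default log directory; adjust to where your nodes write logs
    path => "/var/log/elasticsearch/*.log"
    start_position => "beginning"
  }
}

filter {
  # Parse the ES 1.x log4j line into timestamp, level, component, node, message
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\]\[%{LOGLEVEL:level}%{SPACE}\]\[%{DATA:component}%{SPACE}\]%{SPACE}\[%{DATA:node}\]%{SPACE}%{GREEDYDATA:log_message}" }
  }
  # Use the log's own timestamp as the event time
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}

output {
  elasticsearch {
    # "hosts" is the Logstash 2.x option name; older 1.x releases use "host"
    hosts => ["localhost:9200"]
    index => "es-logs-%{+YYYY.MM.dd}"
  }
}
```

Is this the right approach? We are also unsure whether we should add a multiline codec on the file input so that Java stack traces get merged into the preceding log event, and whether it is safe to index these logs back into the same cluster or better to use a separate monitoring cluster.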