Multiple Filebeats to Logstash

Hello there,

I'm new to ELK on AWS. I have set up an ELK cluster on a single node/EC2 instance (I may expand it later). I have also set up a Linux server (EC2 instance) with Filebeat and Metricbeat running and sending logs and metrics to Logstash, and I can see data from that server showing up in Kibana.
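The Beats on that first server point at Logstash with an output section along these lines in filebeat.yml and metricbeat.yml (the hostname below is just a placeholder for my setup):

    # filebeat.yml / metricbeat.yml -- Logstash host is a placeholder
    output.logstash:
      hosts: ["logstash.internal.example:5044"]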
Now I have set up Filebeat and Metricbeat on other EC2 instances (Linux and Windows), and my questions are:

  1. When setting up Filebeat and Metricbeat on the new servers, do I have to re-run:
    a. metricbeat/filebeat setup --template -E output.logstash.enabled=false -E .....
    b. metricbeat/filebeat setup --dashboards
  2. Do we have to create new indices in Elasticsearch to handle the data from the different servers?

Thanks

It's not necessary, assuming they're running the same Beat version; the index template and dashboards only need to be loaded once per Beat version.
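For reference, that one-time setup can be run from any host that can reach Elasticsearch and Kibana directly, and it usually looks something like this (the hosts below are placeholders; the same applies to metricbeat):

    # run once per Beat version; hosts are placeholders
    filebeat setup --template --dashboards \
      -E output.logstash.enabled=false \
      -E 'output.elasticsearch.hosts=["elasticsearch.internal:9200"]' \
      -E setup.kibana.host=kibana.internal:5601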

No, I'd stick with the defaults and let each Filebeat write to the filebeat-<beat.version>-yyyy.mm.dd index, and do the same for Metricbeat.
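With Logstash in between, that default naming is usually kept by using the metadata that the beats input adds to each event; a minimal pipeline sketch (the Elasticsearch host is a placeholder):

    # beats -> elasticsearch pipeline; host is a placeholder
    input {
      beats { port => 5044 }
    }
    output {
      elasticsearch {
        hosts => ["elasticsearch.internal:9200"]
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      }
    }

That way Filebeat and Metricbeat events from every server end up in the same daily indices without any per-server index configuration.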

As you begin to scale up, you might want to adjust the number of shards used for the index, depending on the amount of data you index on a daily basis. See: How many shards should I have in my Elasticsearch cluster? | Elastic Blog
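If you get to that point, one common way is to set the shard count in each Beat's template settings and re-run the template setup; a sketch (the values below are only examples):

    # filebeat.yml / metricbeat.yml -- values are only examples
    setup.template.settings:
      index.number_of_shards: 3
      index.number_of_replicas: 1

The change only applies to indices created after the template is reloaded; existing daily indices keep their current shard count.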

Thanks a lot
