Hello Team,
We are currently building a monitoring platform for one of the AWS services. As part of that, I'm using the Logstash CloudWatch plugin to pull logs from CloudWatch.
Logs in CloudWatch are region-specific, and as far as I can tell the plugin can only connect to one region at a time. As you know, AWS has 16 regions.
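For reference, this is roughly the kind of per-region input I have in mind (I'm assuming the cloudwatch_logs input here; the log group name, region, and output are just placeholders):

```
input {
  cloudwatch_logs {
    region         => "us-east-1"              # one region per input
    log_group      => ["/aws/lambda/example"]  # placeholder log group
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cloudwatch-us-east-1-%{+YYYY.MM.dd}"  # example per-region index
  }
}
```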
Now, what would be the best architecture to handle this if I have a lot of data (assuming enterprise scale)? Should I use separate pipelines, or separate Logstash servers (on different EC2 instances)?
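If separate pipelines are the way to go, I assume pipelines.yml on a single Logstash instance would look something like this, with one config file per region (IDs, paths, and worker counts are made up):

```
# /etc/logstash/pipelines.yml -- one pipeline per region
- pipeline.id: cloudwatch-us-east-1
  path.config: "/etc/logstash/conf.d/cloudwatch-us-east-1.conf"
  pipeline.workers: 2
- pipeline.id: cloudwatch-eu-west-1
  path.config: "/etc/logstash/conf.d/cloudwatch-eu-west-1.conf"
  pipeline.workers: 2
# ...and so on for the other regions
```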
As an alternative, I could use the AWS API to pull logs from the different regions and write them to a file (or multiple files, depending on data size) that Logstash then reads as input, but that seems complicated as well.
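In that case the Logstash side would just be a file input, something like the following (the paths and the one-JSON-event-per-line format are assumptions about whatever the export script writes):

```
input {
  file {
    path           => "/var/log/cloudwatch-export/*.json"   # wherever the script drops the files
    start_position => "beginning"
    sincedb_path   => "/var/lib/logstash/sincedb-cloudwatch"
    codec          => "json"                                 # assuming one JSON event per line
  }
}
```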
What is the best way to implement this?
Please do suggest; any help is appreciated.
I understand it depends on the data volume and the capacity of the EC2 instance(s) running Logstash, but at this point we are not even able to estimate that. We are just trying to find the best way to implement this. Please help.
Regards
Rahul Nama