My goal is to send all of the logs from all the remote servers to an S3 bucket. I decided to go with the Logstash S3 output plugin since we're already running an ELK setup, so each remote server is already sending its log files to the master ELK instance. I then edited filebeat.conf to enable load balancing and also ship logs to the server's local Logstash, so that Logstash sees all of the logs and the S3 output plugin can transfer them to the S3 bucket.
However, I'm unable to see any logs in the bucket. Please advise.
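To rule out a console/permissions view issue, the bucket contents can be listed directly (assuming the AWS CLI is installed and configured with credentials that can read the bucket):

```sh
# List every object under the bucket; empty output means nothing was written
aws s3 ls s3://elk-logs-testing/ --recursive
```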
**filebeat.conf:**
```yaml
output.logstash:
  # The Logstash hosts
  hosts: ["xx.xx.xx.xx:5044", "localhost:5045"]
  loadbalance: true
  index: test
```
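A quick sanity check on the Filebeat side is the built-in `test` subcommand (available in Filebeat 5.x and later; the config path below assumes a standard package install):

```sh
# Validate the Filebeat config syntax
filebeat test config -c /etc/filebeat/filebeat.yml

# Check connectivity to each configured Logstash host
filebeat test output -c /etc/filebeat/filebeat.yml
```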
**logstash/conf.d/output.conf:**
```
output {
  s3 {
    region     => "us-east-1"
    bucket     => "elk-logs-testing"
    canned_acl => "private"
    time_file  => 1
    codec      => "json_lines"
  }
}
```
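To rule out a pipeline syntax problem, the Logstash config can be validated without starting the service (binary and config paths assume a standard deb/rpm install):

```sh
# Parse and validate the pipeline files, then exit
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/ --config.test_and_exit
```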
**logstash/conf.d/input.conf:**
```
input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
  }
}
```
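And to confirm the beats input is actually listening where Filebeat expects it (port numbers taken from the configs above; assuming `ss` is available, otherwise `netstat -tlnp` works too):

```sh
# Show listening TCP sockets for the beats ports referenced above
ss -tlnp | grep -E '5044|5045'
```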
My overall goal is to keep sending the logs to the master ELK instance while also sending them to the S3 bucket.