Logstash S3 Output Plugin not working

My goal is to send all of the logs from the remote servers to an S3 bucket. I decided to go with the Logstash S3 output plugin since we're already running an ELK setup, so each remote server is sending its log files to the master ELK instance. I also edited filebeat.conf to enable loadbalance and ship logs to the server's local Logstash, so that Logstash sees all of the logs and the S3 output plugin can transfer them to the S3 bucket.

However, I don't see any logs in the bucket. Please advise.


**filebeat.conf:**

```
output.logstash:
  # The Logstash hosts
  hosts: ["xx.xx.xx.xx:5044", "localhost:5045"]
  loadbalance: true
  index: test
```


**logstash/conf.d/output.conf:**

```
output {
  s3 {
    region     => "us-east-1"
    bucket     => "elk-logs-testing"
    canned_acl => "private"
    time_file  => "1"
    codec      => "json_lines"
  }
}
```

**logstash/conf.d/input.conf:**

```
input {
  beats {
    # "source" is not a beats input setting; if the events are JSON,
    # parse them in a filter block instead: json { source => "message" }
    port => 5044
  }
}
```

My overall goal is to keep sending the logs to the master ELK instance while also sending them to the S3 bucket.

Load balancing in Filebeat is not what you want here: I would expect half the events to go to the local Logstash and half to the remote Logstash.

If you want to write them to S3 locally, then just write everything to the local Logstash, have it write the events to S3, and also forward them to the master Logstash using http, tcp, or any one of a number of other input/output pairs; a tcp pair is sketched below.
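For example (a minimal sketch, not from the thread; the port and codec below are assumptions), a tcp output/input pair between the two instances could look like this:

```
# Local Logstash: send a copy of every event to the master over TCP
output {
  tcp {
    host  => "xx.xx.xx.xx"   # the master ELK instance
    port  => 5500            # any free port both sides agree on
    codec => "json_lines"    # one JSON document per line
  }
}

# Master Logstash: receive the forwarded events
input {
  tcp {
    port  => 5500
    codec => "json_lines"
  }
}
```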

Thanks for your reply!

If I want to go with the suggestion you just provided, how do I go about doing that? I'll remove loadbalance from filebeat.conf, but I'm not sure what else needs to be done.

In your Filebeat output, remove "xx.xx.xx.xx:5044" so that you write everything to localhost. In the Logstash instance running on localhost, write to S3 and add a lumberjack output that sends events to "xx.xx.xx.xx:5500". On the remote server, configure a lumberjack input that listens on port 5500; roughly as in the sketch below.
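A sketch of how the three pieces could fit together (the certificate paths are placeholders: the lumberjack output and input require a TLS key pair, which you generate on one machine and copy the certificate to the other):

```
# filebeat.conf on each remote server: only the local Logstash remains
output.logstash:
  hosts: ["localhost:5045"]

# Local Logstash output: write to S3 and forward a copy to the master
output {
  s3 {
    region     => "us-east-1"
    bucket     => "elk-logs-testing"
    canned_acl => "private"
    time_file  => "1"
    codec      => "json_lines"
  }
  lumberjack {
    hosts           => ["xx.xx.xx.xx"]                 # the master ELK instance
    port            => 5500
    ssl_certificate => "/etc/logstash/lumberjack.crt"  # placeholder path
  }
}

# Master Logstash input: listen for the forwarded events
input {
  lumberjack {
    port            => 5500
    ssl_certificate => "/etc/logstash/lumberjack.crt"  # placeholder path
    ssl_key         => "/etc/logstash/lumberjack.key"  # placeholder path
  }
}
```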
