Logstash output to file and s3 is different

I'm trying to build something that will put logs into S3.

In this case I'm working with Netflow, and my first step was just to get something flowing end to end.

input {
  udp {
    port  => 9995
    codec => netflow
  }
}

output {
  s3 {
    access_key_id     => "REMOVED"
    secret_access_key => "REMOVED"
    bucket            => "REMOVED"
  }
  file {
    path => "/var/log/logstash/test.log"
  }
}

The file output shows the decoded flows. However, the s3 output only shows a timestamp, an origin IP, and %{message}. It's also chunking them up into 5 MB files.

Any direction would be appreciated.

The default codec for a file output is json_lines, but the default codec for an s3 output is line (which renders only the timestamp, host, and message fields). You should explicitly set the codec on the s3 output.

Hi,
Thank you for the reply. This got me a little further down the road. I really appreciate it.

Just for anyone else who is interested, the output that worked is:

output {
  s3 {
    access_key_id     => "Removed"
    secret_access_key => "Removed"
    bucket            => "Removed"
    codec             => "json_lines"
  }
}
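
As for the 5 MB chunking mentioned above: the s3 output buffers events locally and rotates to a new S3 object when the buffer reaches a size limit, which defaults to roughly 5 MB. If that rotation is what you're seeing, it can be tuned with the plugin's size_file and time_file settings. A sketch only — the values here are examples, and the time_file unit depends on the plugin version, so check the docs for the version you run:

output {
  s3 {
    access_key_id     => "Removed"
    secret_access_key => "Removed"
    bucket            => "Removed"
    codec             => "json_lines"
    size_file         => 10485760   # example: rotate after ~10 MB instead of the default
    time_file         => 5          # example: also rotate on a time interval, whichever hits first
  }
}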

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.