Problem with S3 as Input

Hi all,

I have the following configuration:

input {
  s3 {
    type => "cloudtrail"
    bucket => $BUCKET_NAME
    prefix => $PREFIX
    access_key_id => $ACCESS_KEY_ID
    secret_access_key => $SECRET
    region => "eu-west-1"
    interval => 60
    codec => cloudtrail
  }
}

filter {
}

output {
  stdout { codec => rubydebug }
}

(Of course, the uppercase variables are placeholders; my actual config has the real values there.)
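For reference, with the placeholders filled in, the input block would look something like this (the bucket name, prefix, and credentials below are made up); note that all string values have to be quoted:

input {
  s3 {
    type => "cloudtrail"
    bucket => "my-cloudtrail-bucket"                # hypothetical bucket name
    prefix => "AWSLogs/123456789012/CloudTrail/"    # hypothetical key prefix
    access_key_id => "AKIAXXXXXXXXXXXXXXXX"         # hypothetical credentials
    secret_access_key => "xxxxxxxxxxxxxxxxxxxxxxxx"
    region => "eu-west-1"
    interval => 60      # poll the bucket every 60 seconds
    codec => cloudtrail
  }
}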

I don't get any output.

To see what happens, I ran Logstash with verbose debug logging:

bin/logstash -f logstash.conf -vvv

I keep getting messages like these:

S3 input: Found key {:key=> SOME .JSON.GZ FILE FROM THE BUCKET, :level=>:debug, :file=>"logstash/inputs/s3.rb", :line=>"111", :method=>"list_new_files"}
S3 input: Adding to objects[] {:key=> SOME .JSON.GZ FILE FROM THE BUCKET, :level=>:debug, :file=>"logstash/inputs/s3.rb", :line=>"116", :method=>"list_new_files"}

The key (the .json.gz file) changes as new files appear in the bucket, so Logstash is clearly scanning the bucket, but nothing is ever written to stdout as it should be.

I'm running Logstash 2.1.1.

Any ideas?
I'd be grateful for any help.

Thanks!

Nitz

At a guess, it's probably this: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-s3.html#plugins-inputs-s3-sincedb_path
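The s3 input keeps a sincedb file recording the last-modified date of the newest object it has processed; if that recorded date is at or past the dates of the files in the bucket, the plugin can list them without emitting anything new. A minimal sketch of pinning the sincedb to a known path so you can inspect it, or delete it to force a full re-read (the path below is just an example):

input {
  s3 {
    type => "cloudtrail"
    bucket => $BUCKET_NAME
    prefix => $PREFIX
    access_key_id => $ACCESS_KEY_ID
    secret_access_key => $SECRET
    region => "eu-west-1"
    interval => 60
    codec => cloudtrail
    # Pin the sincedb so it is easy to find; stopping Logstash, deleting
    # this file, and restarting should make the plugin re-read the bucket.
    sincedb_path => "/var/lib/logstash/sincedb_s3"   # example path
  }
}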

Nitz, did you ever figure this out? I'm having the same problem.