S3 dynamic folder naming in YYYY/mm/dd

Someone on another website proposed a mechanism for dynamically creating YYYY/mm/dd folders for the S3 output plugin, effectively changing the first few lines of the `write_on_bucket` function to the following:

    def write_on_bucket(file)
      # find and use the bucket
      bucket = @s3.buckets[@bucket]

      # build a YYYY/mm/dd/ prefix from the current date
      t = Time.new
      date_s3 = t.strftime("%Y/%m/%d/")
      remote_filename = "#{@prefix}#{date_s3}#{File.basename(file)}"
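Taken out of the plugin, the key-building lines can be checked in plain Ruby. The prefix and file path below are just example values; in the plugin, `@prefix` comes from the output configuration:

```ruby
# Stand-in for the plugin's @prefix config option (example value).
prefix = "logs/"

# Same construction as in write_on_bucket above.
t = Time.new
date_s3 = t.strftime("%Y/%m/%d/")
remote_filename = "#{prefix}#{date_s3}#{File.basename("/tmp/ls.s3.part0.txt")}"

puts remote_filename  # e.g. logs/2015/01/02/ls.s3.part0.txt
```

`%Y`, `%m`, and `%d` are zero-padded, so the resulting keys sort lexicographically by date, which is what makes the S3 "folder" listing come out in order.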

This worked! It created the dated subfolders, as long as a prefix was specified, just as suggested. So what is the problem? Two problems, actually:

  1. The `logstash-programmatic-access-test-object*` file will not delete itself
  2. When `size_file` is specified in the output plugin, the size increments with each iteration, even though each individual file is small

Does anyone know what might be going on?

Regarding #1: it was necessary to apply the same change to `remote_filename` in the delete function as well, so this is no longer an issue. Still tracking down the incrementing `size_file`.
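One way to keep the write and delete paths from drifting apart again is to factor the key construction into a single helper that both call. This is only an illustrative sketch, not the plugin's actual API; the helper name and the injectable `time` parameter are my own additions:

```ruby
# Illustrative helper: the write and delete paths must build identical keys,
# otherwise the cleanup of logstash-programmatic-access-test-object* looks
# up a key that was never written and the object is left behind.
def dated_remote_filename(prefix, file, time = Time.new)
  "#{prefix}#{time.strftime('%Y/%m/%d/')}#{File.basename(file)}"
end

# Both write_on_bucket and the delete function would then build the key as:
#   remote_filename = dated_remote_filename(@prefix, file)
puts dated_remote_filename("logs/", "/tmp/access-test-object-123", Time.utc(2015, 1, 2))
# → logs/2015/01/02/access-test-object-123
```

Note one remaining edge case with this scheme: a file written just before midnight and deleted just after would still get two different keys, since each call re-reads the clock.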