Someone on another website proposed a mechanism for dynamically creating folders in YYYY/mm/dd form for the S3 output plugin, effectively changing the first few lines of the `write_on_bucket` function to the following:
```ruby
def write_on_bucket(file)
  # find and use the bucket
  bucket = @s3.buckets[@bucket]
  t = Time.new
  date_s3 = t.strftime("%Y/%m/%d/")
  remote_filename = "#{@prefix}#{date_s3}#{File.basename(file)}"
  # ... (rest of the function unchanged)
```
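For reference, the key construction in that change can be isolated into a small standalone sketch. The `s3_key` helper below is hypothetical (not part of the plugin's API) and just mirrors how the prefix, the `strftime` date path, and the file's basename are concatenated into the remote object key:

```ruby
require "time"

# Hypothetical helper mirroring the patched write_on_bucket logic:
# prefix + "YYYY/mm/dd/" + basename of the local temp file.
def s3_key(prefix, file, now = Time.now)
  date_s3 = now.strftime("%Y/%m/%d/")
  "#{prefix}#{date_s3}#{File.basename(file)}"
end

puts s3_key("logs/", "/tmp/ls.s3.host.txt", Time.utc(2015, 6, 9))
# → logs/2015/06/09/ls.s3.host.txt
```

S3 has no real directories, so writing keys with embedded slashes like this is what makes the console display them as nested folders.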
This worked! It created the date subfolders, as long as a prefix was set as the website suggested. So what is the problem? Two problems, actually:
- The `logstash-programmatic-access-test-object*` test object will not delete itself
- When `size_file` is specified in the output plugin, the file size keeps incrementing with each iteration, even though each individual file is small
Does anyone know what might be going on?