MULTIPLE FILES IN S3 AS INPUT AND CREATE MULTIPLE INDEXES IN ELASTICSEARCH

Hi everyone,

My requirement is: I have an S3 bucket with the following hierarchy:

FOLDERNAME1
--> filename1
FOLDERNAME2
--> Filename2

and so on.

Logstash should use each filename as the index name in Elasticsearch. Below is my current configuration file. It works if the files sit directly in the bucket (not under any folder), but the folder structure is important. Any idea how I could do this?

input {
  s3 {
    bucket => ""
    access_key_id => ""
    secret_access_key => "*"
    region => "us-east-1"
    codec => "json"
  }
}

filter {
  mutate {
    add_field => {
      "file" => "%{[@metadata][s3][key]}"
    }
  }
}
output {
  amazon_es {
    hosts => *
    region => "us-east-1"
    aws_access_key_id => ''
    aws_secret_access_key => ''
    index => "%{file}"
    template_name => "sqe_template"
    template_overwrite => "true"
    codec => "json"
  }
}

For example, if I have a file named sqe-2019-01-01, the index is 2019-01-01.

How do I get the same behavior once the files are inside their respective folders?
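One possible approach (an untested sketch, not a confirmed solution): when a file is under a folder, the S3 object key contains the prefix (e.g. FOLDERNAME1/filename1), so the full key cannot be used as the index name directly. A dissect filter can split the key on the slash; the field names `folder` and `filename` below are arbitrary names chosen for this sketch:

```
filter {
  # [@metadata][s3][key] holds the full object key, e.g. "FOLDERNAME1/sqe-2019-01-01".
  # dissect splits it at the "/" into a folder part and a filename part.
  dissect {
    mapping => {
      "[@metadata][s3][key]" => "%{[@metadata][folder]}/%{[@metadata][filename]}"
    }
  }
}

output {
  amazon_es {
    # ... same settings as before ...
    # Reference only the filename part as the index name.
    index => "%{[@metadata][filename]}"
  }
}
```

Note this assumes exactly one folder level; keys nested deeper than FOLDER/file would need a different mapping pattern. Keeping the parts under [@metadata] also avoids writing the extra fields into the indexed documents.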

Thank You,
Mohit Ruke
