Using S3 as input plugin


(Niraj Kumar) #1

Hi,

I am using s3 as the input plugin to download logs from an S3 bucket. The logs in my bucket are organized into folders, each containing a .tar.gz file, and each .tar.gz file contains a JSON file. Can the S3 plugin help me achieve this?

I have used the following configuration, but I don't seem to get anything in Kibana.

input {
  s3 {
    bucket => "dev-metrics"
    access_key_id => "xxxxxxxxx"
    secret_access_key => "xxxxxxxxxxxxxxx"
    region => "us-east-1"
    sincedb_path => "./last-s3-file"
    codec => "json"
  }
}
output {
  stdout {}
  elasticsearch_http {
    host => "localhost"
    port => "9200"
  }
}

Can anyone help me achieve this?


(Magnus Bäck) #2

The S3 input supports gzipped plain files but not tarballs. You'll have to write a separate script that pulls the tarballs and unpacks them into a local directory that you can have Logstash monitor.
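The unpacking side of such a script could be sketched like this. This is a minimal illustration, not a complete solution: it assumes the tarballs have already been downloaded from S3 (e.g. with boto3 or the aws CLI), and the directory names and the `unpack_tarballs` helper are made up for the example.

```python
import tarfile
from pathlib import Path

def unpack_tarballs(src_dir: str, dest_dir: str) -> list:
    """Extract the JSON files from every .tar.gz in src_dir into dest_dir.

    Returns the paths of the extracted JSON files. The tarballs are
    assumed to already be local (downloaded from S3 beforehand).
    """
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    extracted = []
    for tarball in sorted(Path(src_dir).glob("*.tar.gz")):
        with tarfile.open(tarball, "r:gz") as tar:
            for member in tar.getmembers():
                if member.isfile() and member.name.endswith(".json"):
                    # Flatten any folder structure inside the tarball so
                    # a simple glob can match the extracted files.
                    member.name = Path(member.name).name
                    tar.extract(member, path=dest)
                    extracted.append(dest / member.name)
    return extracted
```

You could then point a Logstash file input's path option at the destination directory (with the json codec) instead of using the s3 input directly.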