Logstash - dump s3 into aws elasticsearch (vpc) - s3 is hanging forever


(Murali Pathivada) #1

I can transfer data from SQL Server to AWS ES using a Logstash conf, but I'm having issues dumping an S3 bucket into AWS ES.

The last log line before it hangs (an INFO message, not an actual error):
[INFO ] 2018-10-25 23:29:58.266 [[main]<s3] s3 - Using default generated file for the sincedb {:filename=>"/usr/share/logstash/data/plugins/inputs/s3/sincedb_13505270d3cbcd0658a878180decc403"}

my conf:

input {
  s3 {
    bucket => "samplebucket"
    access_key_id => "xxxxxxxx"
    secret_access_key => "xxxxxx"
    region => "us-east-1"
  }
}

output {
  amazon_es {
    hosts => ["https://xxxxxxxxxxxxxxxxxxxxxxs-east-1.es.amazonaws.com"]
    region => "us-east-1"
    aws_access_key_id => "xxxxxxxxxxxxxx"
    aws_secret_access_key => "xxxxxxxxxxxx"
    index => "demo_s3"
  }
}


(Murali Pathivada) #2

Please ignore my message.

This is what I have discovered about the s3 input:

  1. The given s3 conf file runs in the background (bin/logstash -f /etc/logstash/conf.d/mys3.conf) and picks up the files deposited in the S3 bucket.

  2. Logstash automatically identifies new files deposited in S3, processes them in the background, and dumps them into the target, in this case my AWS Elasticsearch.
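For anyone hitting the same "hang": the s3 input polls the bucket on a schedule rather than reacting instantly, and tracks its position in the sincedb file mentioned in the log line above. A minimal sketch of the input with those options made explicit (bucket name and sincedb path here are placeholders, and credentials are omitted for brevity):

```conf
input {
  s3 {
    bucket => "samplebucket"                         # placeholder bucket name
    region => "us-east-1"
    interval => 60                                   # seconds between polls for new objects
    sincedb_path => "/var/lib/logstash/s3_sincedb"   # placeholder path; tracks last-processed position
  }
}
```

So a pipeline that appears to be "hanging forever" may simply be waiting for the next poll, or skipping objects it has already recorded in the sincedb.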

Problem resolved.

Thanks


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.