Error on S3 plugin accessing ELB logs - Logstash version 5

I am getting the following error while trying to read ELB logs from an S3 bucket.
The instance running Logstash is in the same region as the bucket and was booted with a role that gives it full S3 access.

Error: can't convert nil into String
[2016-11-30T18:35:47,828][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin

Here is my conf file:

input {
  s3 {
    type => "elb"
    bucket => "***-test"
    region => "us-east-1"
    # use_ssl => false
  }
}

filter {
  if [type] == "elb" {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:loadbalancer} %{IP:client_ip}:%{NUMBER:client_port:int} %{IP:backend_ip}:%{NUMBER:backend_port:int} %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{NUMBER:elb_status_code:int} %{NUMBER:backend_status_code:int} %{NUMBER:received_bytes:int} %{NUMBER:sent_bytes:int} %{QS:request}" ]
    }
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    # Add geolocation attributes based on the client IP.
    geoip {
      source => "client_ip"
    }
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    # sniffing => true
    manage_template => false
    index => "elb-%{+YYYY.MM.dd}"
    # document_type => "%{[@metadata][type]}"
  }
}
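
For reference, the grok pattern in the filter above is aimed at classic ELB access-log lines of this shape (every value below is made up for illustration):

2016-11-30T18:35:47.123456Z my-elb 192.168.1.10:54321 10.0.0.5:80 0.000073 0.001048 0.000057 200 200 0 29 "GET http://www.example.com:80/ HTTP/1.1"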

When I try to set the use_ssl parameter, I also get an error:

[2016-11-30T18:14:44,310][ERROR][logstash.inputs.s3 ] Unknown setting 'use_ssl' for s3
... :reason=>"Something is wrong with your configuration."}

I was getting the same error until I added a prefix => setting to the input:

input {
  s3 {
    access_key_id => "****"
    secret_access_key => "****"
    bucket => "****-access-logs"
    type => "elbaccesslog"
    prefix => "acct_number_here.*"
  }
}

EDIT: It turns out I read the s3 input plugin docs too quickly; regexps are not supported in prefix, so the above will not work. However, I would expect the below to work, and it does not.

input {
  s3 {
    access_key_id => "****"
    secret_access_key => "****"
    bucket => "****-access-logs"
    type => "elbaccesslog"
    prefix => "/path/to/individual/elb/log/2016/11/1/"
  }
}
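
One thing worth double-checking: classic ELB access logs land under S3 keys of roughly this shape (the account ID, ELB name, and dates here are made-up placeholders):

AWSLogs/123456789012/elasticloadbalancing/us-east-1/2016/11/01/123456789012_elasticloadbalancing_us-east-1_my-elb_20161101T0005Z_10.0.0.1_deadbeef.log

S3 object keys have no leading slash, so a prefix that starts with / will never match anything, which would explain a silent no-op rather than an error.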

Both configs resolved the errors in the logstash log but I'm still not getting any logs read from my ELB logging bucket.

I have tried that before as well. Can you verify that your path is right?

If I add this:

prefix => "AWSLogs/MYACCOUNTNUMBER/elasticloadbalancing/us-east-1/"

I get the same error, but if I make a "mistake":

prefix => "AWSLogs/MYACCOUNTNUMBER/elasticloadbalancing/us-west-1/"

then the error disappears. I am guessing that's because it doesn't find anything to process.

Any luck otherwise?

No such luck. I just found the same thing you did: when the path is accurate I get the can't convert nil into String message, but when it doesn't match I get no error at all and, obviously, nothing in Elasticsearch.

I think I got it. Adding a few more fields to the configuration seems to get it working.

s3 {
  type => "elb"
  bucket => "****-test"
  region => "us-east-1"
  prefix => "AWSLogs/MYACCID/elasticloadbalancing/us-east-1/"
  interval => 20 # seconds
  sincedb_path => "/tmp/s3.sincedbappid"
  backup_add_prefix => "old/logstash-"
  backup_to_bucket => "****-test"
  delete => true
  codec => plain
}

I am pretty sure it is the sincedb_path that does it, though. I turned on debug logging (log.level: debug) in the logstash.yml settings file and it pointed to that function in the s3 plugin's Ruby source.
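
If the sincedb path is indeed the culprit, a minimal sketch of the fix would be to add just that one setting to the original input and leave everything else alone (the /tmp path is only an example; a persistent location is safer, since /tmp may be wiped on reboot):

input {
  s3 {
    type => "elb"
    bucket => "****-test"
    region => "us-east-1"
    prefix => "AWSLogs/MYACCID/elasticloadbalancing/us-east-1/"
    # An explicit sincedb_path keeps the plugin from deriving a default
    # location on its own, which appears to be where the nil comes from.
    sincedb_path => "/tmp/s3.sincedb"
  }
}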

