Logstash s3 input error

I am using the Logstash S3 input plugin to fetch logs from S3.

Below is my config.

input {
  s3 {
    type => "redshift-access-log"
    bucket => "xxxxxxxxxxxxx"
    prefix => "xxxxxxxxxxxxx"
    proxy_uri => "xxxxxxxxxxxxxx"
    region => "us-west-2"
    access_key_id => "xxxxxxxxxxxxx"
    secret_access_key => "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
  }
}
filter {
    if [type] == "redshift-access-log" {
        grok {
            match => { "message" => "\A'%{SYSLOGBASE2:timestamp}%{CRON_ACTION}%{NOTSPACE}%{CRON_ACTION}%{EMAILLOCALPART:db}%{CRON_ACTION}%{EMAILLOCALPART:user}%{CRON_ACTION}%{EMAILLOCALPART:pid}%{CRON_ACTION}%{EMAILLOCALPART:userid}%{CRON_ACTION}%{EMAILLOCALPART:xid}%{CRON_ACTION}%{NOTSPACE}%{SPACE}%{GREEDYDATA:sql}" }
        }
    }
}

output {
  elasticsearch {
    hosts => ["xx.xx.xx.xx"]
 }
 stdout { codec => rubydebug }
}
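
As an aside, a grok expression this long is easier to debug in isolation. A minimal sketch (my suggestion, not part of the original config) that feeds one sample line on stdin and prints the parsed event, reusing the same filter block as above:

  input {
    stdin { type => "redshift-access-log" }
  }
  # keep the filter block from the main config unchanged
  output {
    stdout { codec => rubydebug }
  }

Paste a real Redshift log line into the running pipeline; if the event comes back with a _grokparsefailure tag, the pattern is the problem rather than the S3 input.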

I have the latest version of Logstash installed. When I start Logstash it eventually fails with the error below:
Error: can't convert nil into String.

Can someone please help with this?

--
Niraj

What version? What is the complete error message?

@warkolm

These are the versions I currently have:

 niraj@niraj-z820:~/testing$ dpkg -l | grep logstash
ii  logstash                                              1:5.0.2-1                                           all          An extensible logging pipeline
niraj@niraj-z820:~/testing$ dpkg -l | grep elasticsearch
ii  elasticsearch                                         5.0.0                                               all          Elasticsearch is a distributed RESTful search engine built for the cloud. Reference documentation can be found at https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html and the 'Elasticsearch: The Definitive Guide' book can be found at https://www.elastic.co/guide/en/elasticsearch/guide/current/index.html
niraj@niraj-z820:~/testing$ dpkg -l | grep kibana
ii  kibana                                                5.0.0                                               amd64        Explore and visualize your Elasticsearch data


[2016-12-07T11:25:39,624][INFO ][logstash.pipeline        ] Pipeline main started
[2016-12-07T11:25:39,648][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2016-12-07T11:25:40,638][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}
[2016-12-07T11:26:05,319][INFO ][logstash.inputs.s3       ] Registering s3 input {:bucket=>"xxxxxxxxxxxxxxxxxxxxxx", :region=>"us-west-2"}
[2016-12-07T11:26:05,847][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://xxxxxxx:9200"]}}
[2016-12-07T11:26:05,848][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2016-12-07T11:26:06,014][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2016-12-07T11:26:06,020][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["xxxxxxx:9200"]}
[2016-12-07T11:26:06,089][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>2000}
[2016-12-07T11:26:06,094][INFO ][logstash.pipeline        ] Pipeline main started
[2016-12-07T11:26:06,133][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2016-12-07T11:26:08,813][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Plugin: <LogStash::Inputs::S3 type=>"redshift-access-log", bucket=>"xxxxxxxxxxxxxxxxxxxxxx", prefix=>"redshift-test-03", region=>"us-west-2", access_key_id=>"xxxxxxxxxx", secret_access_key=>"xxxxxxxxxxxxxx", id=>"3c6ecdbe72ba9f17ad486fea3c00ad2086dacaab-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_d4050914-df01-4fe4-b361-b191de784ebf", enable_metric=>true, charset=>"UTF-8">, delete=>false, interval=>60, temporary_directory=>"/tmp/logstash">
  Error: certificate verify failed

Error: certificate verify failed

But... that's a completely different error message. Which problem do you need help with?

Yeah, that's what I'm not able to understand: why do I get a certificate error when I use the s3 plugin?

Are your SSL root certificates out of date? Is there a firewall or similar that breaks HTTPS?

Sorry, it seems I replied too fast.

I fixed the certificate issue; it was caused by my proxy. But now I get Error: can't convert nil into String. when I start Logstash.
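
One thing that may be worth double-checking (an assumption on my part, not a confirmed fix): make sure the s3 input's proxy_uri is a complete URI, including the scheme and port, and credentials if your proxy needs them, e.g.:

  proxy_uri => "http://user:pass@proxy.example.com:8080"

If a piece of the URI is missing, a nil coming out of parsing it could plausibly end up somewhere a String is expected.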

Then you still haven't shown the complete error message and context for that error.

@magnusbaeck

Pasting the error below.

[2016-12-12T08:44:43,935][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["xxxxxxxxxxxx"]}
[2016-12-12T08:44:44,043][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>2000}
[2016-12-12T08:44:44,053][INFO ][logstash.pipeline        ] Pipeline main started
[2016-12-12T08:44:44,143][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2016-12-12T08:44:47,438][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Plugin: <LogStash::Inputs::S3 type=>"redshift-access-log", bucket=>"xxxxxxxxxxxxx", prefix=>"redshift-test-03", proxy_uri=>"http://proxy.com:8080", region=>"us-west-2", access_key_id=>"xxxxxxxxxx", secret_access_key=>"xxxxxxxxxxx", id=>"fa580a93375f7244b11597f4537dbfeb5481a685-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_0927b42b-6ab2-4979-81ee-4bc89e43cbf4", enable_metric=>true, charset=>"UTF-8">, delete=>false, interval=>60, temporary_directory=>"/tmp/logstash">
  Error: can't convert nil into String
[2016-12-12T08:44:48,840][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x21e05593 URL:http://xxxxxxxxxx:9200>, :healthcheck_path=>"/"}
[2016-12-12T08:44:49,602][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Plugin: <LogStash::Inputs::S3 type=>"redshift-access-log", bucket=>"xxxxxxxx", prefix=>"redshift-test-03", proxy_uri=>"http://proxy.com:8080", region=>"us-west-2", access_key_id=>"xxxxxxxxxx", secret_access_key=>"xxxxxxxxxxxx", id=>"fa580a93375f7244b11597f4537dbfeb5481a685-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_0927b42b-6ab2-4979-81ee-4bc89e43cbf4", enable_metric=>true, charset=>"UTF-8">, delete=>false, interval=>60, temporary_directory=>"/tmp/logstash">

I had hoped the logs would include a stack trace that pinpointed exactly where the exception occurred. I don't know what's up here.
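
For what it's worth, that message is Ruby's TypeError text: it shows up when something that must be a String turns out to be nil. A minimal sketch (plain Ruby, not Logstash code) reproducing the same class of error:

```ruby
# Somewhere inside the plugin, a nil value (e.g. a missing config setting or
# S3 object attribute) is presumably being used where a String is required.
# Concatenating nil into a String is the classic way to hit this:
begin
  "/tmp/logstash/" + nil   # nil standing in for an expected String value
rescue TypeError => e
  # Older Rubies (JRuby 1.7 / 1.9 mode, which Logstash 5 ships) word this
  # "can't convert nil into String"; newer Rubies say
  # "no implicit conversion of nil into String".
  puts "TypeError: #{e.message}"
end
```

So the stack trace would tell you which setting or value is nil; the message alone doesn't.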

@magnusbaeck What do I have to do to provide you the stack trace? If you can tell me how, I can provide it.

I don't think there's a way (short of modifying the source code) to get Logstash to produce the stack trace.

Here are the steps to reproduce the Error: can't convert nil into String:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.