I'm relatively new to Elastic and Logstash. I'm trying to get CloudTrail logs stored in S3 ingested into Elasticsearch, using the logstash-codec-cloudtrail plugin. The plugin appears to work, but eventually throws the error below. The debug log shows a lot of files being processed from S3, but no new index is ever created in Elasticsearch. Any suggestions or thoughts are appreciated.
[2017-07-11T22:19:24,684][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::S3 bucket=>"cloudtrail-test", region=>"us-east-1", prefix=>"AWSLogs/1234567890/CloudTrail/", sincedb_path=>"/var/lib/logstash/sincedb-1234567890", delete=>false, interval=>60, type=>"cloudtrail", codec=><LogStash::Codecs::CloudTrail id=>"cloudtrail_492091d5-d7a1-486d-a8e2-0dd475051842", enable_metric=>true, charset=>"UTF-8">, id=>"9121c9ac9c16e09bcf3e5e86a8fada2de8a25539-1", enable_metric=>true, temporary_directory=>"/tmp/logstash">
Error: undefined method `has_key?' for nil:NilClass
Exception: NoMethodError
Stack: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-cloudtrail-3.0.2/lib/logstash/codecs/cloudtrail.rb:25:in `decode'
org/jruby/RubyArray.java:1613:in `each'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-cloudtrail-3.0.2/lib/logstash/codecs/cloudtrail.rb:21:in `decode'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:174:in `process_local_log'
org/jruby/RubyProc.java:281:in `call'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:254:in `read_gzip_file'
org/jruby/ext/zlib/JZlibRubyGzipReader.java:648:in `each'
org/jruby/ext/zlib/JZlibRubyGzipReader.java:659:in `each_line'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:254:in `read_gzip_file'
org/jruby/RubyIO.java:1201:in `open'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:251:in `read_gzip_file'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:236:in `read_file'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:168:in `process_local_log'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:311:in `process_log'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:142:in `process_files'
org/jruby/RubyArray.java:1613:in `each'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:137:in `process_files'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:93:in `run'
org/jruby/RubyProc.java:281:in `call'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/interval.rb:20:in `interval'
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-s3-3.1.5/lib/logstash/inputs/s3.rb:92:in `run'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:456:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:449:in `start_input'
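For what it's worth, here is my working theory as a minimal Ruby sketch. Per the stack trace, `read_gzip_file` gunzips the object and hands each line to the codec, so a line that doesn't parse to a Hash (e.g. a blank line, or a non-CloudTrail JSON document) would leave the codec calling `has_key?` on nil. The file contents below are synthetic stand-ins, not an actual CloudTrail object:

```ruby
require 'zlib'
require 'json'

# Synthetic stand-in for a downloaded CloudTrail object: one valid
# CloudTrail-shaped line followed by a blank line.
path = '/tmp/sample.json.gz'
Zlib::GzipWriter.open(path) { |gz| gz.write(%({"Records":[]}\n\n)) }

# Mimic the s3 input: gunzip and feed each line to the codec.
Zlib::GzipReader.open(path) do |gz|
  gz.each_line do |line|
    parsed = JSON.parse(line) rescue nil  # the blank line yields nil
    begin
      parsed.has_key?('Records')          # roughly what the codec checks
    rescue NoMethodError => e
      puts "codec would crash here: #{e.class}"
    end
  end
end
```

If that theory holds, any object under the prefix whose decompressed lines aren't CloudTrail-shaped JSON would kill the input thread, which would also explain why the plugin keeps restarting.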
The logstash.conf file looks like this:
input {
  # bucket for testing
  s3 {
    bucket       => "cloudtrail-test"
    region       => "us-east-1"
    prefix       => "AWSLogs/1234567890/CloudTrail/"
    sincedb_path => "/var/lib/logstash/sincedb-1234567890"
    delete       => false
    interval     => 60 # seconds
    type         => "cloudtrail"
    codec        => cloudtrail
  }
}
filter {
  if [recipientAccountId] == "1234567890" {
    mutate { add_field => { "pod" => "test" } }
  }
}
output {
  elasticsearch {
    hosts         => ["localhost"]
    index         => "cloudtrail4-%{+YYYY.MM}"
    document_type => "cloudtrail"
  }
}
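To narrow things down, I was also planning to temporarily swap the elasticsearch output for a stdout output (rubydebug is the standard pretty-printing codec), to confirm whether any events make it through the pipeline at all before blaming the Elasticsearch side:

```
output {
  stdout { codec => rubydebug }
}
```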
Logstash version:
/usr/share/logstash/bin/logstash --version
logstash 5.5.0
Thank you for any available assistance!