Logstash with CloudTrail

Hi,

I have configured Logstash to pull CloudTrail data from Amazon S3, and I am running it from the command line on CentOS 7.

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf
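
Before starting it I also sanity-check the config file; a quick sketch, assuming the --config.test_and_exit flag that Logstash 5.x provides:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf --config.test_and_exit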

Below is my Logstash config.

input {
  s3 {
    type => "cloudtrail-dev"
    interval => 60
    bucket => "auditing-all"
    prefix => "AWSLogs/42188xxxxxx/"
    backup_to_dir => "/etc/s3backup/"
    proxy_uri => "http://web-proxy.example.com:8080"
    codec => cloudtrail {}
    region => "us-east-1"
    access_key_id => "AKIAIIxxxxxxxxxxxxxxx"
    secret_access_key => "etNTajHrxxxxxxxxxxxxxxxxxxxxxxx"
  }
}
filter {
  if [type] == "cloudtrail-dev" {
    mutate {
      # strip the ".amazonaws.com" suffix so eventSource is just the service name
      gsub => [ "eventSource", "\.amazonaws\.com$", "" ]
      add_field => {
        "document_id" => "%{eventID}"
      }
    }
    if ![ingest_time] {
      ruby {
        code => "event.set('ingest_time', Time.now.utc.strftime('%FT%TZ'))"
      }
    }
    # drop health-check noise generated by a known service account
    if [eventSource] == "elasticloadbalancing" and [eventName] == "describeInstanceHealth" and [userIdentity][userName] == "secret_username" {
      drop {}
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["example.com:9200"]
    codec => json
  }
}

I can see Logstash starting properly and pulling objects from S3, as shown below, but I do not see any index being created in Elasticsearch.

08:59:16.512 [[main]<s3] DEBUG logstash.inputs.s3 - S3 input: Adding to objects[] {:key=>"AWSLogs/xxxxxxxxxxxxxx/CloudTrail/eu-central-1/2015/04/14/xxxxxxxxxxxxxx_CloudTrail_eu-central-1_20150414T0745Z_QJw480dF0ORUe3ID.json.gz"}
08:59:16.512 [[main]<s3] DEBUG logstash.inputs.s3 - objects[] length is:  {:length=>470997}
08:59:16.512 [[main]<s3] DEBUG logstash.inputs.s3 - S3 input: Found key {:key=>"AWSLogs/xxxxxxxxxxxxxx/CloudTrail/eu-central-1/2015/04/14/xxxxxxxxxxxxxx_CloudTrail_eu-central-1_20150414T0755Z_nSWdrZE9X40scBYa.json.gz"}
08:59:16.513 [[main]<s3] DEBUG logstash.inputs.s3 - S3 input: Adding to objects[] {:key=>"AWSLogs/xxxxxxxxxxxxxx/CloudTrail/eu-central-1/2015/04/14/xxxxxxxxxxxxxx_CloudTrail_eu-central-1_20150414T0755Z_nSWdrZE9X40scBYa.json.gz"}
08:59:16.513 [[main]<s3] DEBUG logstash.inputs.s3 - objects[] length is:  {:length=>470998}
08:59:16.513 [[main]<s3] DEBUG logstash.inputs.s3 - S3 input: Found key {:key=>"AWSLogs/xxxxxxxxxxxxxx/CloudTrail/eu-central-1/2015/04/14/xxxxxxxxxxxxxx_CloudTrail_eu-central-1_20150414T0805Z_rc73kdFvCTNkeaJV.json.gz"}
08:59:16.513 [[main]<s3] DEBUG logstash.inputs.s3 - S3 input: Adding to objects[] {:key=>"AWSLogs/xxxxxxxxxxxxxx/CloudTrail/eu-central-1/2015/04/14/xxxxxxxxxxxxxx_CloudTrail_eu-central-1_20150414T0805Z_rc73kdFvCTNkeaJV.json.gz"}
08:59:16.513 [[main]<s3] DEBUG logstash.inputs.s3 - objects[] length is:  {:length=>470999}
08:59:18.359 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:494] DEBUG logstash.pipeline - Pushing flush onto pipeline
08:59:23.374 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:494] DEBUG logstash.pipeline - Pushing flush onto pipeline
08:59:30.493 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:494] DEBUG logstash.pipeline - Pushing flush onto pipeline
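
This is how I am checking for new indices (just a sketch, assuming the cluster at example.com:9200 is reachable from this host and not behind any security layer):

curl -s 'http://example.com:9200/_cat/indices?v'

No new index shows up in that list.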

Am I doing something wrong here, or will it just take some time to ingest the data?

--
Niraj

You have stdout set up as an output. Do you see anything there, or is it just not getting into Elasticsearch?

If you are not getting any stdout messages either, then I would look at the pipeline stats from the Logstash node stats API.

In particular, look at the in and out values for each of your filters. It's possible that your drop {} statement is deleting every event, so nothing ever reaches the output. In that case you would see something like "in":123456 and "out":0, meaning that 123,456 events came into the filter but 0 went out of it.
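
Something along these lines should show it (a sketch, assuming the default monitoring API on port 9600 and the Logstash 5.x single-pipeline stats path):

curl -s 'http://localhost:9600/_node/stats/pipeline?pretty'

Then look at the entries under pipeline.plugins.filters; each one carries an events object, roughly:

"filters" : [
  {
    "id" : "...",
    "events" : { "in" : 123456, "out" : 0 },
    "name" : "drop"
  }
]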
