CloudTrail import into ELK

Hi,

I think this is a classic use case: I'm trying to import CloudTrail logs from an S3 bucket into an ELK stack, but I have this problem: each log file is a bundle of events, and the "Records" field isn't deserialized, so the individual events aren't analyzed.

I want to use the cloudtrail codec, but it doesn't seem to be available and I haven't managed to build it successfully. If it isn't included in the package, maybe another codec can do the job. I tried the "json" and "json_lines" codecs.
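(For context, this is the standard way I tried to pull in the non-bundled codec, assuming the host can reach rubygems.org and that the plugin is published under this name:

    /opt/bitnami/logstash/bin/logstash-plugin install logstash-codec-cloudtrail

That install did not work for me here.)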

My config:

input {
  s3 {
    bucket   => "bucket_name"
    type     => "s3_cloudtrail"
    role_arn => "arn_role"
    region   => "eu-west-3"
    interval => 120
    codec    => "json"
    # also tried: codec => "json_lines"
  }
}

filter {
  json {
    source => "Records"
  }
  mutate {
    # dots escaped so the domain suffix matches literally
    gsub => [ "eventSource", "\.amazonaws\.com$", "" ]
    add_field => {
      "document_id" => "%{eventID}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "s3_cloudtrail-%{+YYYY.MM.dd}"
    # also tried: index => "%{[type]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
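Side note on a possible workaround: since each CloudTrail file is a single JSON document whose "Records" field is an array, the split filter should be able to fan it out into one event per record without any extra codec. A minimal sketch, assuming the input keeps codec => "json" so "Records" is already parsed into an array:

    filter {
      # emit one Logstash event per element of the "Records" array
      split {
        field => "Records"
      }
    }

After the split, each record's fields sit under [Records], so the filters above would need to reference e.g. [Records][eventSource] and %{[Records][eventID]}.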

  • Versions:
    root@ip-172-31-15-251:/opt/bitnami# ./logstash/bin/logstash --version
    logstash 6.3.0
    root@ip-172-31-15-251:/opt/bitnami# ./elasticsearch/bin/elasticsearch --version
    Version: 6.3.0, Build: oss/tar/424e937/2018-06-11T23:38:03.357887Z, JVM: 1.8.0_161
    root@ip-172-31-15-251:/opt/bitnami# ./kibana/bin/kibana --version
    6.3.0

Thanks for any guidance or help.

Bump, please.

Finally, I managed to compile the "cloudtrail" codec myself, and now it's fine: the JSON records are correctly decoded.
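For anyone hitting the same wall, here is roughly the build-and-install sequence I used, sketched from memory; the repository URL and gemspec name are assumed from the plugin's usual naming conventions and may differ on your setup:

    # fetch the plugin source (URL assumed from the plugin name)
    git clone https://github.com/logstash-plugins/logstash-codec-cloudtrail.git
    cd logstash-codec-cloudtrail
    # build the gem from the repo's gemspec
    gem build logstash-codec-cloudtrail.gemspec
    # install the locally built gem into Logstash
    /opt/bitnami/logstash/bin/logstash-plugin install logstash-codec-cloudtrail-*.gem

After that, the s3 input just needs codec => "cloudtrail" instead of "json", and the json filter on "Records" is no longer necessary.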