WAF logs configuration from S3 bucket to Elasticsearch using the Logstash S3 plugin

Hello Everyone

I am trying to configure AWS WAF 2.0 logs from an S3 bucket into Elasticsearch, but I am facing some errors. Please help me with this.

My current Logstash configuration:

input {
  s3 {
    access_key_id => "myid"
    secret_access_key => "mykey"
    region => "eu-west-1"
    bucket => "aws-waf-logs-waf2.0"
    prefix => "AWSLogs/1234564897/WAFLogs/cloudfront/MY-WAF/2022/////1234564897_waflogs_cloudfront_MY-WAF*"
    type => "waf-log"
    interval => 10
    sincedb_path => "/tmp/.waf-log_since.db"
  }
}
filter {
  if [type] == "waf-log" {
    json {
      source => "message"
    }
    date {
      match => [ "[timestamp]", "UNIX_MS" ]
    }
    geoip {
      source => "[httpRequest][clientIp]"
      target => "geoip"
    }
    ruby {
      # Flatten the WAF header array into top-level fields
      code => '
        headers = event.get("[httpRequest][headers]")
        headers.each { |kv| event.set(kv["name"], kv["value"]) } if headers.is_a?(Array)
      '
    }
  }
}
output {
  elasticsearch {
    user => "myuser"
    password => "mypass"
    ssl => true
    ssl_certificate_verification => false
    hosts => ["https://xx.xx.xx.xx:9200"]
    index => "waf-logs-%{+YYYY.MM.dd}"
    ilm_enabled => false
  }
}
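One thing worth noting about the input above: in the logstash-input-s3 plugin, `prefix` is a literal S3 key prefix, not a glob, so `*` wildcards are not expanded. A minimal sketch of an input that matches by prefix only (bucket and path taken from the question; the trimmed prefix is an assumption about where the log objects live):

```
input {
  s3 {
    region => "eu-west-1"
    bucket => "aws-waf-logs-waf2.0"
    # Literal prefix only; every object key under it is read
    prefix => "AWSLogs/1234564897/WAFLogs/cloudfront/MY-WAF/"
    type => "waf-log"
    interval => 10
    sincedb_path => "/tmp/.waf-log_since.db"
  }
}
```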

Current error which I am facing:

Kindly help ASAP; I am stuck in this situation.

Your error log is pretty clear: you have an error with the credentials used to access the S3 bucket. You need to check that everything is correct.
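One way to rule out a typo in hard-coded keys is to drop `access_key_id` / `secret_access_key` from the input entirely, so the plugin falls back to the AWS default credential provider chain. A minimal sketch (same bucket and region as in the question):

```
input {
  s3 {
    # No access_key_id / secret_access_key here: the plugin falls back to
    # the AWS default credential chain (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
    # environment variables, ~/.aws/credentials, or an EC2 instance profile)
    region => "eu-west-1"
    bucket => "aws-waf-logs-waf2.0"
    type => "waf-log"
  }
}
```

If that works, the problem is the keys pasted into the config; if it still fails, check that the IAM identity has `s3:ListBucket` and `s3:GetObject` on the bucket.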

@leandrojmp
Thank you for the reply.

Now, in these logs, I am not able to build a coordinate map with the geo_point field. Can anyone suggest how I can achieve this?
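The geoip filter writes a `[geoip][location]` field, but Kibana can only draw a coordinate map if the index mapping declares that field as `geo_point`. The built-in `logstash-*` template does this automatically; a custom index name like `waf-logs-*` needs its own template. A minimal sketch, assuming Elasticsearch 7.8+ composable index templates (the template name `waf-logs` is mine):

```
PUT _index_template/waf-logs
{
  "index_patterns": ["waf-logs-*"],
  "template": {
    "mappings": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}
```

Note that templates only apply to newly created indices, so the mapping takes effect on the next daily index (or after a reindex of existing ones).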

Also, the WAF log volume is quite large; can we decrease the size in Elasticsearch?
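One way to shrink the documents is to drop fields you never search on before they reach Elasticsearch, for example with the mutate filter's `remove_field`. Which fields are safe to drop depends on your use case; the ones below are only examples:

```
filter {
  mutate {
    # Drop the raw JSON line and bulky sub-structures you do not query
    remove_field => [ "message", "[httpRequest][headers]" ]
  }
}
```

For overall retention, an ILM policy with a delete phase on the `waf-logs-*` indices can cap how long the data is kept.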

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.