I am just starting with Logstash. I have a use case where a bunch of logs (each line a JSON document) are persisted to S3 by an external service. I have a set of filters that match certain field values in these logs (e.g. key1=val1 AND key2=val2, key1=val3, ...). I did some initial analysis and I am not sure Logstash fits this use case, but I wanted to give it a try. I am having trouble figuring out how to provide multiple filters that match these log lines so that I can emit a CloudWatch metric for each matched combination. For example: is it possible for a single log line to be published as two different CloudWatch metrics, each with its own set of dimensions?
It seems like Logstash might be a difficult fit for my use case, but before giving up I wanted to check with the experts.
Sample config:
input {
  s3 {
    access_key_id     => "my-access-id"
    secret_access_key => "secret"
    bucket            => "logstash-test"
  }
}
filter {
  grok {
    # the filters themselves would be loaded from an external source,
    # preferably S3; e.g. one filter is key1=val1 AND key2=val2,
    # another filter is key1=val2
    add_field => ["CW_metricname", "%{metric_name}"]
    add_field => ["CW_unit", "Count"]
    add_field => ["CW_value", "%{metric_value}"]
    # dimensions are name/value pairs: dimension name, then its value
    add_field => ["CW_dimensions", "key1", "CW_dimensions", "%{key1}"]
  }
}
output {
  cloudwatch {
    access_key_id     => "my-access-id"
    secret_access_key => "secret"
    region            => "eu-west-1"
  }
}
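To make the question more concrete, here is roughly what I had in mind for the "one line, two metrics" case. This is only a sketch, not something I have working: the field names, metric names, and the use of the `json` and `clone` filters are my own guesses at an approach. The idea is that a line matching both filter combinations gets cloned, so each copy can carry its own metric name and dimensions into the cloudwatch output.

```
filter {
  # each log line is a JSON document; parse it into event fields
  json {
    source => "message"
  }

  # a line matching combination 1 (key1=val1 AND key2=val2) may also
  # match combination 2 (key1=val1), so clone it: the clone carries
  # metric 1 while the original continues on and can carry metric 2
  if [key1] == "val1" and [key2] == "val2" {
    clone { clones => ["combo1"] }   # the clone's [type] becomes "combo1"
  }

  if [type] == "combo1" {
    mutate {
      add_field => ["CW_metricname", "combo1_matches"]   # made-up metric name
      add_field => ["CW_unit", "Count"]
      add_field => ["CW_value", "1"]
      add_field => ["CW_dimensions", "key2", "CW_dimensions", "%{key2}"]
    }
  } else if [key1] == "val1" {
    mutate {
      add_field => ["CW_metricname", "combo2_matches"]   # made-up metric name
      add_field => ["CW_unit", "Count"]
      add_field => ["CW_value", "1"]
      add_field => ["CW_dimensions", "key1", "CW_dimensions", "%{key1}"]
    }
  }
}
```

Even if this works for two hard-coded combinations, I don't see how it would scale to a list of filter combinations maintained in an external file, which is what I ultimately need.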