I have the following filter in Logstash that parses AWS ELB access logs:
filter {
  grok {
    match => [ "message", '%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:loadbalancer} %{IP:client_ip}:%{NUMBER:client_port:int} (?:%{IP:backend_ip}:%{NUMBER:backend_port:int}|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} (?:%{NUMBER:elb_status_code:int}|-) (?:%{NUMBER:backend_status_code:int}|-) %{NUMBER:received_bytes:int} %{NUMBER:sent_bytes:int} "(?:%{WORD:verb}|-) (?:%{GREEDYDATA:request}|-) (?:HTTP/%{NUMBER:httpversion}|-( )?)" "%{DATA:userAgent}"( %{NOTSPACE:ssl_cipher} %{NOTSPACE:ssl_protocol})?' ]
  }
}
which results in various fields in Elasticsearch, one of them being the request field, with a possible value of

    https://api.example.net:443/v2/domain.com/actions?somefield=somevalue
Is there a way to add a second grok filter that operates on that field before the event is indexed into Elasticsearch, with different conditions based on the domain.com part of the path? For example, if the domain.com string is present, add it to a domain_name field in Elasticsearch, and so on.
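
Something along these lines is what I have in mind. This is only a rough, untested sketch; the second grok pattern, the domain_name field, and the has_domain tag are my own guesses:

filter {
  # hypothetical second pass over the request field produced by the first grok
  grok {
    match => [ "request", 'https?://%{URIHOST}/v2/%{HOSTNAME:domain_name}/' ]
    tag_on_failure => []   # don't flag events whose path has no domain segment
  }
  if [domain_name] {
    # domain_name was extracted, so any domain-specific logic could go here
    mutate { add_tag => [ "has_domain" ] }
  }
}

The idea is that the second grok runs after the first one in the same filter block, so it can see the request field the first one created. Is this roughly the right approach?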