I am sending logs from Filebeat to Logstash, where I filter them with regex-based grok patterns and store the results in Elasticsearch.
My Logstash config:
input {
  beats {
    port => 5002
  }
}
filter {
  grok {
    add_tag => [ "first" ]
    match => { "message" => "\"definitionName\": \"(?<build_DefinitionName>.*?)\"" }
  }
  grok {
    add_tag => [ "first" ]
    match => { "message" => "\"requestedFor\": \"(?<build_RequesterName>.*?)\"" }
  }
  grok {
    add_tag => [ "first" ]
    match => { "message" => "\"buildNumber\": \"(?<build_BuildNumber>.*?)\"" }
  }
  grok {
    add_tag => [ "first" ]
    match => { "message" => "\"teamProject\": \"(?<build_TeamProject>.*?)\"" }
  }
  grok {
    add_tag => [ "first" ]
    match => { "message" => "(?<build_ErrorMessage> ERR .*)" }
  }
  grok {
    add_tag => [ "end" ]
    match => { "message" => "Current state: job state = '(?<build_IsFailed>.+)'" }
  }
  if "end" in [tags] {
    aggregate {
      task_id => "%{source}"
      code => "event.set('build_DefinitionName', '%{build_DefinitionName}')
               event.set('key', 'value')
               map['build_DefinitionName'] ||= event.get('build_DefinitionName')
               event.set('build_BuildNumber', map['build_BuildNumber'])
               event.set('build_RequesterName', map['build_RequesterName'])
               event.set('build_TeamProject', map['build_TeamProject'])
               event.set('build_IsFailed', map['build_IsFailed'])
               event.set('build_ErrorMessage', map['build_ErrorMessage'])"
      map_action => "create_or_update"
      push_previous_map_as_event => true
      end_of_task => false
    }
  }
  if "first" not in [tags] and "end" not in [tags] {
    drop { }
  }
  mutate {
    remove_field => [ "message" ]
  }
  mutate {
    remove_tag => [ "first" ]
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "mylog-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
Since Filebeat ships the log line by line, each of the fields above ends up in a separate event. At the end we try to aggregate them, but the aggregated event has no values for the fields.
For example:
The build_DefinitionName field is present in the last event, but it has no value:
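For context, what I am trying to achieve is the usual aggregate pattern: every "first" event copies its grokked field into the shared map, and the single "end" event reads the map back onto itself. A rough sketch of that flow (assuming %{source} resolves to the same value for every line of one build, which is what I rely on as task_id):

```text
filter {
  if "first" in [tags] {
    aggregate {
      task_id => "%{source}"
      # copy whichever field this particular line produced into the map;
      # event.get returns nil for the fields this line did not match
      code => "map['build_DefinitionName'] ||= event.get('build_DefinitionName')
               map['build_BuildNumber']    ||= event.get('build_BuildNumber')
               map['build_RequesterName']  ||= event.get('build_RequesterName')
               map['build_TeamProject']    ||= event.get('build_TeamProject')
               map['build_ErrorMessage']   ||= event.get('build_ErrorMessage')"
      map_action => "create_or_update"
    }
  }
  if "end" in [tags] {
    aggregate {
      task_id => "%{source}"
      # read everything collected so far back onto the final event
      code => "event.set('build_DefinitionName', map['build_DefinitionName'])
               event.set('build_BuildNumber', map['build_BuildNumber'])
               event.set('build_RequesterName', map['build_RequesterName'])
               event.set('build_TeamProject', map['build_TeamProject'])"
      map_action => "update"
      end_of_task => true
    }
  }
}
```

In my config above the aggregate filter only runs on the "end" event, so nothing ever writes the earlier fields into the map, which may be why the values come out empty.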