I am trying to add a new field that concatenates strings from my log, which looks as follows:
"message" => "#@enterprise=[1.3.6.1.4.1.9.9.187], @timestamp=#@value=2612151602>, @varbind_list=..., @specific_trap=7, @source_ip=\"1.2.3.4\", @agent_addr=#@value=\"\xC0\xA8\v\e\">, @generic_trap=6>"
I want to retrieve the @enterprise value and the @specific_trap value, and add a new field with the combined value to the log event. After the filter it should look like this:
result => 1.3.6.1.4.1.9.9.187.0.7
If anyone knows how to do this, please help. Thanks.
filter {
  mutate {
    add_field => {
      "result" => "%{[@enterprise_value][0]}.0.%{[@specific_trap]}"
    }
  }
}
https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html
Oh, right. Forgot about that part. You should be able to use the kv filter but grok also works. Remember that square brackets need to be escaped in regular expressions.
No... you're misunderstanding the point of the kv filter, and your use of add_field
above doesn't make sense. The kv filter accepts a string of the form
a=b c=d
and turns it into the following fields:
a => b
c => d
You may have to tweak its settings a bit to work with your input.
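For instance, a bare-bones configuration like this (the settings shown are just the defaults spelled out, not tuned to your event) would handle that example string:

filter {
  kv {
    source      => "message"   # parse the message field
    field_split => " "         # pairs are separated by whitespace (default)
    value_split => "="         # key and value are separated by "=" (default)
  }
}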
As you said, I can use mutate to concatenate the strings, but how do I filter the log to get the values I want?
Do you mean the settings of the kv filter, in order to define the values to mutate?
Yes. The kv filter can split the string into keys and values as in my previous example. Then you can combine some of those values using a mutate filter. I don't know how else to explain this. How about you try it out?
Wouldn't ", " (a comma and a space) be a better field splitter?
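Putting the two pieces together, a sketch like the one below might be a starting point. The field names @enterprise and @specific_trap are assumptions based on the sample message, and the kv filter may leave the surrounding brackets and quotes in the values, which you would still have to strip (for example with mutate's gsub or the kv filter's trim_value option):

filter {
  kv {
    source      => "message"
    field_split => ", "   # split pairs on comma + space, as suggested above
    value_split => "="
  }
  mutate {
    # assumes the kv filter produced fields named "@enterprise" and "@specific_trap"
    add_field => {
      "result" => "%{@enterprise}.0.%{@specific_trap}"
    }
  }
}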
Next time please include the whole event and not just the result field.
Try excluding the @timestamp field using the exclude_keys option. Alternatively, invert the logic by using include_keys to choose which keys should be extracted.
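A quick sketch of both options (the key names are taken from your sample message, so adjust them as needed):

filter {
  kv {
    source       => "message"
    field_split  => ", "
    value_split  => "="
    # drop the conflicting key...
    exclude_keys => ["@timestamp"]
    # ...or, alternatively, whitelist only the keys you need:
    # include_keys => ["@enterprise", "@specific_trap"]
  }
}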