How to pass array field to kv-filter

Hi all,

I'm looking for a way within Logstash to pass an array to the kv filter and use it as the include_keys parameter.
For example, in my JSON event I have an array named "keys":

"keys" => [
    [0] "key1",
    [1] "key2",
    [2] "key3"
]
What I've tried so far is

kv {
    include_keys => [ "%{keys}" ]
}
But this doesn't work. It only works if I use

kv {
    include_keys => [ "%{[keys][0]}", "%{[keys][1]}" ]
}

but I want this to be dynamic, so whatever is in [keys] should be used as the kv filter's include_keys. Any ideas?


I don't think there's a way to do what you want with include_keys. How about capturing all fields first and removing the unwanted ones afterwards? The prune filter would be ideal for that, except that it, too, doesn't support arrays (but that's fixable; feel free to file an issue). However, a ruby filter would do the job. I'm pretty sure examples have been posted here in the past, and definitely on StackOverflow (see e.g.
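For reference, with a *static* set of keys a prune-based pipeline could look like the sketch below. whitelist_names is a real prune option (its entries are treated as patterns), but it is not sprintf-expanded, so %{keys} can't be used there; that's the limitation mentioned above.

```
filter {
  kv { }
  prune {
    # static list only; %{keys} would NOT be interpolated here
    whitelist_names => [ "key1", "key2", "key3", "message", "keys" ]
  }
}
```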

Hi Magnus,

thanks for your reply, but the problem is that I don't know the keys to be included in advance. Other people configure the keys which should be used as the kv filter's include_keys.
That's why I'm looking for a way to configure the kv filter's include_keys dynamically, with values that are sent to the Logstash pipeline together with the event data.


The StackOverflow link I posted shows how to delete all fields except those in a particular set, which would allow you to extract all fields with a kv filter but delete those you're not interested in. I believe this would solve your problem. If not, please explain why.

Sorry, I'll try to explain:
As I understand your Ruby example (I'm not familiar with Ruby), it removes all unwanted fields from an event, and then you suggest running a kv filter on the rest. But I don't want to remove any data from an event.

If an event (log line) that includes an additional field "keys" is sent to the Logstash pipeline, the pipeline should use this field as input for the kv filter.
The contents of the field "keys" should be used to find key-value matches in the log line.
So we want to enrich our data with new fields based on the key-value information in the different log lines.

As I understand your Ruby example (I'm not familiar with Ruby), it removes all unwanted fields from an event, and then you suggest running a kv filter on the rest.

No! You run the kv filter to extract all fields from the input string, then use a ruby filter to remove the fields you don't want, based on the contents of the keys field (delete the extracted fields that are not listed in `keys`). In the end you'll get exactly what you want, except that it needs to be done with two or three filters instead of one.
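A sketch of that two-filter approach (assuming the old event['field'] API; the protected list of core fields is my own addition, to keep @timestamp and friends from being purged):

```
filter {
  kv { }   # extract every key/value pair first
  ruby {
    code => "
      wanted = event['keys'] || []
      protected = ['@timestamp', '@version', 'message', 'keys']
      event.to_hash.keys.each { |k|
        event.remove(k) unless wanted.include?(k) || protected.include?(k)
      }
    "
  }
}
```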

Ok, got it...

wanted_fields = "%{keys}" or should it be wanted_fields = event.keys?

wanted_fields = event['keys']

OK, I tested it, but as I assumed, all other content is removed from the event except the fields listed in keys.
If I added a tag to each field added by the kv filter, would there be a way to reference the tag in Ruby?
If yes, I could remove only the fields which were added by the kv filter (have a tag, e.g. kvf) and are not in keys...

Fields can't have tags. I suggest you set the kv filter's target option to store those keys in a subfield. Then you know that all fields in there come from kv. Once you've purged the fields you don't want, move the rest to the top level. You can do that in the same ruby filter.
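Putting that suggestion together, a full pipeline could look like the sketch below (assuming the old event['field'] API; field names are illustrative):

```
filter {
  kv {
    target => "keyval"   # all extracted fields land under [keyval]
  }
  ruby {
    code => "
      wanted = event['keys'] || []
      kv = event['keyval'].to_hash
      # purge extracted fields not listed in [keys]
      kv.keys.each { |k| kv.delete(k) unless wanted.include?(k) }
      # move the survivors to the top level
      kv.each { |k, v| event[k] = v }
      event.remove('keyval')
    "
  }
}
```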

Can you give me an example of how to access subfields in Ruby?
Unfortunately I can't get it working...

event['field']['subfield'] doesn't work?

I've tried it like this (kv filter's target => "keyval"):

ruby {
    code => "
        wanted_fields = event['keys']
        event['keyval'].to_hash.keys.each { |k|
            event['keyval'].remove(k) unless wanted_fields.include? k
        }
    "
}

But I get an error: Ruby exception occurred: undefined method `remove' for ...

Yeah, this is a bit awkward. event isn't really a Ruby hash object, and its method for deleting fields is called remove. However, nested fields are implemented using regular hashes, whose corresponding method is delete. So event.remove('foo') but event['foo'].delete('bar').
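The delete-on-the-nested-hash pattern can be demonstrated with a plain Ruby hash standing in for the event (a sketch only; the field names are made up, and a real LogStash::Event would need event.remove for its top-level fields):

```ruby
# Plain-hash stand-in for an event with a kv target of "keyval"
event = {
  'message' => 'a=1 b=2 c=3',
  'keys'    => ['a', 'c'],
  'keyval'  => { 'a' => '1', 'b' => '2', 'c' => '3' }
}

wanted = event['keys']
# Nested fields are ordinary hashes, so Hash#delete works here
event['keyval'].keys.each do |k|
  event['keyval'].delete(k) unless wanted.include?(k)
end

puts event['keyval'].inspect  # only 'a' and 'c' remain
```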