Hello,
I have list data in Logstash that has been parsed so that it looks like this:
"Attribute": [
"<Book><Name>N1</Name><Value>Clement</Value></Book>",
"<Book><Name>N2</Name><Value>Bundle Name</Value></Book>"
],
I want to loop over that list to generate the following data in the output:
"book1":{
"key": "N1",
"value":"Clement"
},
"book2":{
"key": "N2",
"value":"Bundle Name"
}
Is it possible to do that?
Thanks for any help. I really appreciate it.
The xml filter requires the source field to be a string or a single-element array, so it won't help you. I think your best bet is to write a small snippet of Ruby and put it in a ruby filter. I don't know whether there's a gem for parsing XML that you can use, though. Worst case you'll have to use regexps; obviously a bad option for parsing XML, but if the form of the document is well known it might be good enough.
Writing a custom filter plugin would of course work too.
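For example, here is a minimal sketch of the parsing logic using REXML, which ships with Ruby's standard library, so no extra gem is needed. The field and key names are taken from your example; treat this as untested against your actual pipeline:

```ruby
require "rexml/document"

# Turn an array of "<Book><Name>..</Name><Value>..</Value></Book>" strings
# into { "book1" => { "key" => .., "value" => .. }, "book2" => ... }.
def books_from_attributes(attributes)
  result = {}
  attributes.each_with_index do |xml, i|
    doc = REXML::Document.new(xml)
    result["book#{i + 1}"] = {
      "key"   => doc.elements["Book/Name"].text,
      "value" => doc.elements["Book/Value"].text
    }
  end
  result
end
```

Inside a ruby filter the same logic would look roughly like this (sketch only, assuming the event API with `event.get`/`event.set`):

```text
filter {
  ruby {
    code => '
      require "rexml/document"
      (event.get("Attribute") || []).each_with_index do |xml, i|
        doc = REXML::Document.new(xml)
        event.set("book#{i + 1}", {
          "key"   => doc.elements["Book/Name"].text,
          "value" => doc.elements["Book/Value"].text
        })
      end
    '
  }
}
```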
I've tried the split plugin, but I found some bizarre behavior...
filter {
  split {
    field => "Attribute"
    target => "book_bundle"
    periodic_flush => "false"
  }
}
This takes my source event and clones it into N new events, where N is the number of members of the Attribute array. Each array member is put into the book_bundle field.
Then I added a mutate filter with the gsub option to transform the book_bundle field and place it into book1 .. bookN fields.
The problem is that while my Logstash process is running, I find only the first element transformed as I want, and then when I stop the process the rest of the array's elements appear in the output file!
I can't understand this behavior.
Why are you disabling periodic_flush? The behavior you're describing sounds like what can happen when periodic flushing is disabled.
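In other words, try leaving periodic_flush alone, something like this (same split filter as yours, just without the explicit override):

```text
filter {
  split {
    field  => "Attribute"
    target => "book_bundle"
    # periodic_flush left at its default
  }
}
```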
It works with the ES output. Thank you very much.