Hey all, I'm attempting to combine 2 logs by adding a field from an earlier log to the other log.
I've read around about aggregate, mutate, and all that, but I'm not quite understanding how to do this.
These two events have an id in common (as do other log lines), but I only want to add the name field from eventX to the eventZ log line, so it would all come together as a single event.
You can do that with an aggregate filter. If the two lines always arrive in that order then do something like example 1 in the aggregate filter documentation.
If the order can vary then do something similar to example 3. Note that only things you put into the map are added to the event that is created after the timeout is triggered.
Make sure pipeline.workers is set to 1, and you may need to disable java_execution.
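For the in-order case, a minimal sketch of the example-1 pattern could look like this (the id, name, and event_type field names here are assumptions based on your description, not your actual fields):

```
filter {
  if [event_type] == "eventX" {
    aggregate {
      task_id => "%{id}"                          # correlate the two lines on their shared id
      code => "map['name'] = event.get('name')"   # stash the name for later
      map_action => "create"
    }
  }
  if [event_type] == "eventZ" {
    aggregate {
      task_id => "%{id}"
      code => "event.set('name', map['name'])"    # copy the stashed name onto eventZ
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}
```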
For reference, context is an inner JSON object that contains all the data from the above post.
So, {"field": "value", "context": { "name": "abcd", ...
In this case, I get "class org.jruby.RubyHash cannot be cast to class org.jruby.RubyIO"; I know what the error means, but I don't know why it's happening.
(EDIT: I've also tried json.name, to no avail, in case that might've been the issue.)
Your syntax for accessing the field is wrong: https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references
And just as you use event.set(…) to create/update field values in Ruby, you need to use event.get(…) to read them.
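For a nested field like yours, the reference has to be in bracket form, so inside the aggregate code option it would be roughly (assuming you want the map key to be called name):

```
code => "map['name'] = event.get('[context][name]')"
```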
And your map_action cannot be "create" if you want to work with an already existing map.
(And if you don't want to keep both the enriched eventZ and eventX, you'll need to cancel eventX.)
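Putting that together, the eventX side could look something like this sketch (again with assumed field names); the eventZ side stays as in the earlier sketch, with map_action => "update":

```
aggregate {
  task_id => "%{id}"
  code => "
    map['name'] = event.get('[context][name]')
    event.cancel                                 # drop eventX once its name is in the map
  "
  map_action => "create"
}
```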