I wrote a plugin that people here might find useful. Given a CSV/JSON/YAML file with structured data, it adds fields to your event based on a lookup into that file.
The use case I had for it was geocoding based on account number. I created a CSV file with accountnumber,lat,lon as fields, then used the plugin to look up the account number and add a [location] element to my events.
The filter watches the file's modified time and merges in any changes when the file changes, so it can be used to decouple a database lookup -- i.e., extract your data from the database and then put the file on the Logstash server.
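For the geocoding case described above, the filter block might look something like this. This is a hedged sketch: the option names and the CSV path here are illustrative, so check the augment plugin's documentation for the exact settings your version supports.

```
filter {
  # Look up [accountnumber] in the CSV; the first column is the
  # dictionary key, and the remaining columns (lat, lon) are
  # copied onto the event. Option names are assumptions.
  augment {
    field           => "accountnumber"          # event field holding the lookup key
    dictionary_path => "/etc/logstash/accounts.csv"
  }
}
```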
Translate can only add one field to an event; augment can add multiple fields at once. The use case I wrote it for was adding lat/lon based on an account number.
The other use case I see is combining data from an external source. For example, you have a username in your logs, but you want to add details like their full name, business unit, groups, etc.
You can periodically extract a full dump from your source of truth to a CSV file and then augment your events with that information.
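The periodic extraction step can be as simple as a small script run from cron. Here's a minimal sketch in Python using an in-memory SQLite table as a stand-in for your real source of truth; the table, column names, and output path are all made up for illustration.

```python
import csv
import sqlite3

# Hypothetical source-of-truth table; in practice this would be your
# real user database, queried on a schedule (e.g. via cron).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, full_name TEXT, business_unit TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?, ?)", [
    ("jdoe", "Jane Doe", "Finance"),
    ("asmith", "Al Smith", "Engineering"),
])

# Dump a full extract to CSV; the filter picks up the new file based
# on its modified time.
with open("users.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["username", "full_name", "business_unit"])  # header row
    writer.writerows(
        conn.execute("SELECT username, full_name, business_unit FROM users")
    )
```

Atomically replacing the file (write to a temp name, then rename) avoids Logstash reading a half-written CSV mid-dump.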
I don't see how. For a CSV file, translate uses the first field as the dictionary key and the second field as the value, so you'd need a separate CSV file for each attribute you wanted to add to the event.
Oh, you gave another example of where one would want to augment an event with multiple fields. That should still be possible with a translate + json combo.
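A sketch of that translate + json combo for the lat/lon case: the dictionary values are JSON strings that a json filter then parses into a real object. Field names, the dictionary path, and the sample coordinates below are made up; option names follow the classic translate filter (newer versions may use source/target instead of field/destination).

```
filter {
  # Look up the account number in a YAML dictionary whose values are
  # JSON strings, e.g.:
  #   "12345": '{"lat": 41.12, "lon": -71.34}'
  translate {
    field           => "accountnumber"
    destination     => "location_json"
    dictionary_path => "/etc/logstash/accounts.yml"
  }
  # Parse the JSON string into a structured [location] field.
  json {
    source       => "location_json"
    target       => "location"
    remove_field => ["location_json"]
  }
}
```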
Hi, I'm a bit stumped: I want to implement this use case and I can't figure out how to handle lat/lon with the translate + json combo. I don't have much experience with ELK.
If you have some time, would you be able to post an example of this configuration?
Thanks in advance!
(PS: Decided to ask here because Google got me here while searching for the subject; I bet others ended up in this thread as well.)