If I understood correctly, you are trying to populate a field based on the content of another field?
One way to do that in Logstash is with the `translate` filter, where you would have a dictionary with key-value pairs. The keys in this dictionary would be the values of the first field, and the values would be what you want to populate into the new field.
So, based on your example, you would need a dictionary like this in an external file:
```yaml
"biuser@NPRD.BIGD.BE": "Data & Analytics"
"hive/nprd.bigd.BE": "Marketing Automation"
"hue/el755.nprd.bigd.be": "Analytics Boost"
"impala.nprd.bigd.BE": "Business insights"
"bhdg@PROD.BE": "Product & network Intelligence"
```
Then you would need this `translate` filter in your pipeline:
```
translate {
  field => "user"
  destination => "[dataconsumption][doc_lob]"
  dictionary_path => "/path/to/the/dictionary/file.yml"
  refresh_interval => 300
  fallback => "Others"
}
```
What this filter does is check whether the value of the `user` field exists as a key in the dictionary file. If it exists, it will take the value for that key and set it as the value of the field `[dataconsumption][doc_lob]`; if it does not exist, it will set the value of `[dataconsumption][doc_lob]` to `Others`, because the `fallback` option is set to that value.
For example, if the `user` field has the value `impala.nprd.bigd.BE`, then after the event passes through the translate filter, it will have the field `[dataconsumption][doc_lob]` with the value `Business insights`.
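If it helps to see the lookup logic outside of Logstash, here is a minimal Python sketch of the same behavior (the nested `[dataconsumption][doc_lob]` field is flattened to a plain `doc_lob` key here, just for illustration):

```python
# Dictionary mirroring the YAML file above.
dictionary = {
    "biuser@NPRD.BIGD.BE": "Data & Analytics",
    "hive/nprd.bigd.BE": "Marketing Automation",
    "hue/el755.nprd.bigd.be": "Analytics Boost",
    "impala.nprd.bigd.BE": "Business insights",
    "bhdg@PROD.BE": "Product & network Intelligence",
}

def translate(event, field="user", destination="doc_lob", fallback="Others"):
    """Look up the source field's value; write the fallback when no key matches."""
    event[destination] = dictionary.get(event.get(field), fallback)
    return event

print(translate({"user": "impala.nprd.bigd.BE"}))  # doc_lob -> "Business insights"
print(translate({"user": "someone@UNKNOWN.BE"}))   # doc_lob -> "Others"
```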
You will need to build a process to update the dictionary file; the `refresh_interval` option in the filter tells Logstash to check the file for changes after that interval has passed.
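That update process can be anything that rewrites the file; as a rough sketch, a scheduled Python job could regenerate the simple `"key": "value"` YAML lines (the `write_dictionary` helper and the use of a temp directory here are my own illustration; in practice you would write to the path given in `dictionary_path`):

```python
import os
import tempfile

def write_dictionary(path, mapping):
    """Rewrite the translate dictionary as simple '"key": "value"' YAML lines."""
    with open(path, "w", encoding="utf-8") as f:
        for key, value in mapping.items():
            f.write(f'"{key}": "{value}"\n')

# A temp directory is used so the sketch runs anywhere; a real job would
# target the file Logstash reads and rerun on a schedule (e.g. cron).
path = os.path.join(tempfile.mkdtemp(), "file.yml")
write_dictionary(path, {"biuser@NPRD.BIGD.BE": "Data & Analytics"})
print(open(path).read())
```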
There are other filters that can be used to enrich your data, such as `jdbc_static` and `memcached`, but the translate filter is the easiest one to use.