Hello,
I want to match my data against a set of codes. I tried to do that using the translate plugin, with a file containing those mappings, but it seems like a bad option because I have several different mappings, which leads me to maintain many auxiliary files. Is there any other way to do this?
If you are comfortable using the ruby filter, you can leverage Ruby case statements. That is what we do to translate code_values to display_values in our pipeline.
Example from our external ruby file:
def filter(event)
  # Map numeric codes to human-readable display values
  case event.get('field1')
  when 0 then event.set('shifts', 'Yes')
  when 1 then event.set('shifts', 'No')
  end
  case event.get('field2')
  when 0 then event.set('view_access', 'Internal')
  when 1 then event.set('view_access', 'Public')
  end
  case event.get('field3')
  when 0 then event.set('locked', 'Yes')
  when 1 then event.set('locked', 'No')
  end
  case event.get('field4')
  when 0 then event.set('action_status', 'Assigned')
  when 1 then event.set('action_status', 'In Progress')
  when 2 then event.set('action_status', 'Completed')
  end
  case event.get('field5')
  when 0 then event.set('assign_work_detail', 'Yes')
  when 1 then event.set('assign_work_detail', 'No')
  end
  # The ruby filter expects the script's filter method to return an array of events
  [event]
end
Thank you. However, when I have a large number of combinations, is this type of conditional more efficient than the translate filter? Which one is the better option?
Honestly, I do not know, but we have not noticed any latency issues. The translate option would produce a pretty nasty pipeline for us, given that some of our ruby filters have 20+ fields that get converted. I enjoy the manageability of the ruby approach (one script file per pipeline).
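If you would rather stay with translate but avoid the auxiliary files, the plugin also accepts an inline dictionary. A rough sketch for one of the fields above (option names assume a recent version of the translate filter where they are source and target; dictionary keys are strings, so check how your plugin version handles numeric source values):

filter {
  translate {
    # Inline dictionary instead of a dictionary_path file
    source     => "field1"
    target     => "shifts"
    dictionary => {
      "0" => "Yes"
      "1" => "No"
    }
  }
}

You would need one translate block per field, which is part of why the single ruby script felt more manageable to us.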