Hello,
we use in a logstash pipeline an external ruby script which contains several methods to process logfiles.
now we want to add a new method to rename all top level fields of a document based on a hashtable.
I have adapted a method that was used to filter nested fields.
def filter(event)
transformKnownFields(event)
return [event]
end
def transformKnownFields(event)
events = event.get()
for k,v in events do
fieldname = k.downcase
if fieldname == "elapsed"
v = v*1000
end
ecsfield = $ecsmappings[fieldname]
event.set(ecsfield, v)
event.remove(k)
end
end
$ecsmappings = {
"action" => "[event][action]",
"actionname" => "[event][action]",
"controller" => "[event][category]",
"controllername" => "[event][category]",
"correlationid" => "[trace][id]",
"culture" => "[client][geo][region_iso_code]",
"duration" => "[event][duration]",
"elapsed" => "[event][duration]",
"elapsedmilliseconds" => "[event][duration]",
"exception" => "[error][type]",
"exceptionmessage" => "[error][message]",
"host" => "[url][domain]",
"httpmethod" => "[http][request][method]",
"level" => "[log][level]",
"machinename" => "[host][name]"
}
Unfortunately it doesn't work like that. i also tried to use "event.to.hash" but i had no luck.
i'm not very familiar with ruby and i'm not really getting anywhere for days now. can someone give me please some advise
Hy Badger,
thanks for the quick reply. unfortunately the code doesn't work when i put it in our external ruby script that is called from within logstash pipeline.
i assume that this example only works directly in the pipeline or?
is it not possible to use the event.to_hash function in external ruby scripts?
Apache, Apache Lucene, Apache Hadoop, Hadoop, HDFS and the yellow elephant
logo are trademarks of the
Apache Software Foundation
in the United States and/or other countries.