Logstash filter ruby

Hi

How do I add a new field containing a percentage calculation using the ruby filter?

A new field called 'MemValuePct' should contain the value for each 'type_instance' as a percentage.

Example:

MemValuePct = free / (free + cached + buffered + used) * 100
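For concreteness, here is that formula worked through in plain Ruby with made-up byte counts (the numbers are hypothetical, just to show the arithmetic):

```ruby
# Hypothetical memory readings in bytes (made-up values).
free     = 2_000_000_000.0
cached   = 1_000_000_000.0
buffered =   500_000_000.0
used     = 4_500_000_000.0

# free / (free + cached + buffered + used) * 100
mem_value_pct = free / (free + cached + buffered + used) * 100
puts mem_value_pct  # => 25.0
```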


Thanks

Use the event.get method to fetch fields from the event and the event.set method to add a new field.
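Outside of Logstash, the same get/set pattern can be sketched with a minimal hash-backed stand-in for the event object (the FakeEvent class below is a simplification for illustration, not the real Logstash Event API):

```ruby
# Minimal stand-in for a Logstash event: just enough get/set to show the pattern.
class FakeEvent
  def initialize(fields)
    @fields = fields
  end

  def get(name)
    @fields[name]
  end

  def set(name, value)
    @fields[name] = value
  end
end

event = FakeEvent.new("value" => 42)

# Inside a real `ruby { code => "..." }` block you would write these two calls:
v = event.get("value")       # fetch an existing field
event.set("doubled", v * 2)  # add a new field

puts event.get("doubled")  # => 84
```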

The logs show errors after implementing the ruby filter. I am new to Ruby; I just wrote the filter based on examples.

[2020-06-08T19:48:01,428][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 756985856.0:Float
[2020-06-08T19:48:01,434][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 2911322112.0:Float
[2020-06-08T19:48:07,621][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 223920128.0:Float
[2020-06-08T19:48:07,627][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 206065664.0:Float
[2020-06-08T19:48:07,634][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 166301696.0:Float
[2020-06-08T19:48:07,636][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 3544559616.0:Float
[2020-06-08T19:48:11,347][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `each' for 235282432.0:Float

Here is my ruby filter:

ruby {
  code => '
    a = event.get("value")
    if a
      sum = 0
      a.each {
        sum += "value"
      }
      event.set("MemValuePct", [a / sum * 100])
    end
  '
}

Apparently [value] is a float, so it does not have a .each method.
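You can reproduce that exact error in plain Ruby; numbers simply do not respond to `each`:

```ruby
# Calling .each on a Float raises the same NoMethodError that Logstash logged.
begin
  756985856.0.each { |x| x }
rescue NoMethodError => e
  puts e.class  # => NoMethodError
end
```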

Still getting the same errors.

I converted it to an integer:
a = event.get("value").to_i

Then I reconstructed the filter, suspecting the array syntax was not proper, but landed on the same errors:

ruby {
  code => '
    a = event.get("[value]")
    if a
      sum = 0
      a.each_index { |x|
        sum += a[x]["value"]
      }
      event.set("valuepct", [a / sum * 100])
    end
  '
}

[value] is not an array, so you cannot iterate over it.

Okay, so how do I loop over the values, add them up, and calculate the percentage?

These are separate events with one integer value each, aren't they? So you don't even have access to the data you are trying to process unless you aggregate the events first, calculate the percentages, and split them again afterwards.

Yes, they are separate events with integer values. I wrote the aggregate filter. Can someone validate it please? Eventually I need to get a percentage value for 'buffered', 'free', 'used', and 'cached' in a separate field like 'valuepct'.

aggregate {
  task_id => "%{type_instance}"
  code => "
    map['sum'] ||= 0; map['sum'] += event.get('value').to_i;
    event.set('valuepct', [event.get('value').to_i / event.get('sum').to_i * 100])
  "
}

Errors:

[2020-06-09T20:17:07,491][ERROR][logstash.filters.aggregate] Aggregate exception occurred {:error=>#<ZeroDivisionError: divided by 0>, :code=>"\n\t\t\tmap['sum'] ||= 0; map['sum'] += event.get('value').to_i;\n\t\t\tevent.set('valuepct', [event.get('value').to_i/event.get('sum').to_i * 100])\n\t\t\t", :map=>{"sum"=>9761632256}, :event_data=>{"@timestamp"=>2020-06-09T12:16:57.000Z, "type_instance"=>"free", "plugin"=>"memory", "host"=>"rhel5vm3", "@version"=>"1", "collectd_type"=>"memory", "type"=>"collectd", "value"=>3425472512.0}}
  1. If you use type_instance as the task id you will get 4 maps instead of one. The task id has to be a value that the events share.
  2. Your event doesn't have a field "sum"; your map does.
  3. You cannot use the sum while processing the events because it isn't complete at that point. When the first event is being processed, the sum is only that event's value, so the percentage is 100%. When the second is being processed, the sum covers only the first two events, and so on.

I think you will have to collect all the events in your map first, calculate the sum and percentages, and separate them with a split filter.
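The arithmetic behind that collect-then-split approach can be sketched in plain Ruby (outside Logstash, with made-up collectd readings; in the real pipeline the collecting happens in an aggregate filter map and the fan-out in a split filter):

```ruby
# Made-up memory readings, one per collectd event for the same host/timestamp.
events = [
  { "type_instance" => "free",     "value" => 2_000_000_000.0 },
  { "type_instance" => "cached",   "value" => 1_000_000_000.0 },
  { "type_instance" => "buffered", "value" =>   500_000_000.0 },
  { "type_instance" => "used",     "value" => 4_500_000_000.0 },
]

# Step 1: collect everything first, so the sum is complete before any division.
sum = events.sum { |e| e["value"] }

# Step 2: only now compute each event's percentage.
events.each do |e|
  e["valuepct"] = e["value"] / sum * 100
end

events.each { |e| puts "#{e['type_instance']}: #{e['valuepct']}%" }
# free: 25.0%, cached: 12.5%, buffered: 6.25%, used: 56.25%
```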

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.