Kibana didn't load the data when using the ruby filter in Logstash

I wrote Ruby code in my Logstash conf file like this:
mutate {
  add_field => {"[location]" => "[0,0]"}
}
ruby {
  code => 'event.set("[location]", [(event.get("%{[lt][py]}").to_f * 5.6) / 10), event.get("[lt][px]"])) '
}

What I found in the Logstash log file is this:
[2017-07-24T12:17:46,538][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@127.0.0.1:9200/, :path=>"/"}
[2017-07-24T12:17:46,638][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x3dff0f9b>}
[2017-07-24T12:17:46,639][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-07-24T12:17:46,666][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-07-24T12:17:46,671][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x4907425a>]}

I couldn't find the data in Kibana. After I removed the Ruby code, Logstash worked fine.
Does anyone know how to deal with that?

Thanks,
Muhammad Abbas

Any suggestions for the above?

You have too many brackets etc.

This should work:

mutate {
  # set [location] as a path to an Array.
  add_field => {"[location]" => [0, 0]}  
}
ruby {
  # overwrite previous array, path [location]
  code => 'event.set("[location]", [0.56 * event.get("[lt][py]").to_f, event.get("[lt][px]")])'
}
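
If [lt][px] can also arrive as a string, it is probably safer to call .to_f on it as well so both array elements stay numeric (a small variation, not tested against your data):

ruby {
  # same as above, but coerce px to a Float too
  code => 'event.set("[location]", [0.56 * event.get("[lt][py]").to_f, event.get("[lt][px]").to_f])'
}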

This almost works but does not :frowning: :

mutate {
  add_field => {"[location]" => [], "[location][1]" => "%{[lt][px]}"}
}
ruby {
  code => 'event.set("[location][0]", 0.56 * event.get("[lt][py]").to_f)'
}
# setting a field in 'add_field' uses sprintf and creates a String in "[location][1]".
# Event: {"@timestamp"=>2017-07-24T13:44:29.892Z, "@version"=>"1", "lt"=>{"px"=>11, "py"=>560}, "location"=>[313.6, "11"]}
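
If you prefer the two-step version above, one way around the sprintf String (a sketch, not tested against your data) is to convert [location][1] back to a number inside the ruby filter:

ruby {
  code => '
    # compute the first element as before
    event.set("[location][0]", 0.56 * event.get("[lt][py]").to_f)
    # add_field/sprintf left a String in [location][1]; coerce it to a Float
    event.set("[location][1]", event.get("[location][1]").to_f)
  '
}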

To experiment on your dev box/laptop, you can do this kind of thing:

$ bin/bundle console
Resolving dependencies...................
irb: warn: can't alias context from irb_context.
irb(main):001:0> require 'logstash/event'
=> true
irb(main):002:0> event = LogStash::Event.new
=> #<LogStash::Event:0x62b635fe>
irb(main):003:0> event.set("[location]", [0,0])
=> [0, 0]
irb(main):004:0> event.to_hash
=> {"@timestamp"=>2017-07-24T13:44:29.892Z, "@version"=>"1", "location"=>[0, 0]}
irb(main):005:0> event.set("[lt]", {"py" => 560, "px" => 11})
=> {"py"=>560, "px"=>11}
irb(main):006:0> event.to_hash
=> {"@timestamp"=>2017-07-24T13:44:29.892Z, "@version"=>"1", "lt"=>{"px"=>11, "py"=>560}, "location"=>[0, 0]}
irb(main):007:0> event.set("[location]", [0.5 * event.get("[lt][py]").to_f, 2 * event.get("[lt][px]").to_f])
=> [280.0, 22.0]
irb(main):008:0> event.to_hash
=> {"@timestamp"=>2017-07-24T13:44:29.892Z, "@version"=>"1", "lt"=>{"px"=>11, "py"=>560}, "location"=>[280.0, 22.0]}

Hi @guyboertje,
I tried this code but it doesn't work.

When I try it, the same Logstash log output comes out, and I couldn't find any data in Kibana.

Here is my Logstash log file after applying the code above.

The first one was the solution for me after trying it many times.
Thanks @guyboertje
