Logstash jdbc input pulling IP addresses in decimal format

Hi,

I am pulling events from SQL Server using the jdbc input in Logstash.

I get all the events I queried for, but the IP addresses come back in a different format, apparently as decimal integers. I want them in regular dotted a.b.c.d notation.

examples:
1411339896
-1979711478
8485306
999744501

Any suggestions?

You would use a ruby filter:

input { generator { count => 1 lines => [ '1411339896', '-1979711478', '8485306', '999744501' ] } }
filter {
    ruby {
        code => '
            # Treat the value as a 32-bit integer in network byte order:
            # pack it big-endian, then unpack the four bytes and join them.
            n = event.get("message").to_i
            event.set("ip", [n].pack("N").unpack("CCCC").join("."))
        '
    }
}
output { stdout { codec => rubydebug { metadata => false } } }

produces events like

        "ip" => "138.0.0.10",
   "message" => "-1979711478"

        "ip" => "0.129.121.186",
   "message" => "8485306"

I am somewhat skeptical of the second one.
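For a quick sanity check outside Logstash, the same conversion can be run in plain Ruby (a minimal sketch; like the filter above, it assumes the values are 32-bit integers in network byte order):

# Convert each decimal value to dotted-quad form. A negative input is
# the same 32-bit value read from a signed column; pack("N") masks it
# back to the unsigned 32-bit equivalent.
[1411339896, -1979711478, 8485306, 999744501].each do |n|
  puts [n].pack("N").unpack("CCCC").join(".")
end

That said, 8485306 is below 2^24, so a leading zero octet like 0.129.121.186 is what the conversion will always produce for a value that small.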

I applied that, but I get 0.0.0.0 for all values.

Filter:

filter {
  ruby {
    code => '
        n = event.get("IPv4").to_i
        event.set("IP", [n].pack("N").unpack("CCCC").join("."))
    '
  }
}

I tested the above with just stdin and stdout and it works fine, but with the real jdbc input and elasticsearch output I get 0.0.0.0 for all values.

That would suggest event.get("IPv4").to_i is returning zero, which is what happens if the field is missing or if its value is a string that does not start with a digit.
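To see why a missing field ends up as 0.0.0.0, here is the same logic in plain Ruby (just for illustration):

# event.get on a non-existent field returns nil, nil.to_i is 0,
# and packing 0 gives four zero bytes.
n = nil.to_i
puts [n].pack("N").unpack("CCCC").join(".")   # prints 0.0.0.0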

If you are running on the command line, try adding

puts "IPv4 is #{event.get("IPv4").inspect}"

after the event.get line. Are you sure the field is called IPv4 and not, say, IPV4?
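If you are not sure what the field is actually called, a throwaway ruby filter that dumps the event's field names can settle it (a quick sketch using the event API's to_hash). Note that the jdbc input lowercases column names by default (lowercase_column_names), so the field may well be ipv4 rather than IPv4.

filter {
  ruby {
    code => '
        # Print every field name on the event so the exact
        # spelling and case of the IP column can be confirmed.
        puts event.to_hash.keys.inspect
    '
  }
}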
