Create a field based on IP Range

Hello everyone!
I currently have Logstash set up to take in flow data. I also have a YAML-format file of ip-range (key), autonomous system number (value) pairs. I wanted to know how to use a lookup filter, such as the translate or dns plugin, to create a field by looking up the IP and receiving the AS number.

Thanks,
N
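For context, the lookup being asked about here (matching an IP address against CIDR ranges to find its ASN) can be sketched in plain Python with the standard-library ipaddress module. The ranges and ASNs below are made-up examples, not taken from the poster's file; in Logstash itself this would be done with a filter plugin rather than inline code, and note that the translate filter's dictionary_path does exact key matching, not CIDR range matching.

```python
import ipaddress

# Hypothetical ip-range -> ASN pairs, shaped like the YAML file described above
RANGES = {
    "192.0.2.0/24": 64500,
    "198.51.100.0/24": 64501,
}

def asn_for_ip(ip):
    """Return the ASN whose range contains ip, or None if no range matches."""
    addr = ipaddress.ip_address(ip)
    for cidr, asn in RANGES.items():
        # Membership test: is this address inside the CIDR block?
        if addr in ipaddress.ip_network(cidr):
            return asn
    return None

print(asn_for_ip("198.51.100.42"))  # -> 64501
print(asn_for_ip("203.0.113.9"))    # -> None (no range matches)
```

A linear scan like this is fine for a small dictionary; for large range sets, a prefix-tree (radix) lookup is the usual approach, which is essentially what MaxMind's .mmdb format provides.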

Maybe not what you are asking, but the GeoIP filter can be used with MaxMind's ISP/ASN database (http://geolite.maxmind.com/download/geoip/database/GeoLite2-ASN.tar.gz), which will get you an ASN for an IP address.

I actually came across this, and it would be perfect for my use case; however, I am having a hard time adding it to my config file without getting an error.

Here is the filter I placed in my config file:

geoip {
    source => "[sflow][srcIP]"
    database => "/etc/logstash/dictionaries/GeoLite2-ASN_20170718/GeoLite2-ASN.mmdb"
    fields => "autonomous_system_number"
}

And this is the error that appears in the logstash logs:

[2017-07-21T16:35:06,532][ERROR][logstash.filters.geoip   ] Unknown error while looking up GeoIP data {:exception=>java.lang.UnsupportedOperationException: Invalid attempt to open a GeoLite2-ASN database using the city method, :field=>"[sflow][srcIP]"
[2017-07-21T16:35:06,539][ERROR][logstash.pipeline        ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"Invalid attempt to open a GeoLite2-ASN database using the city method", "backtrace"=>["com.maxmind.geoip2.DatabaseReader.get(com/maxmind/geoip2/DatabaseReader.java:150)", "com.maxmind.geoip2.DatabaseReader.city(com/maxmind/geoip2/DatabaseReader.java:217)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "RUBY.filter(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/lib/logstash/filters/geoip.rb:160)", "RUBY.do_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145)", "RUBY.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161)", "RUBY.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:43)", "RUBY.initialize((eval):262)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.initialize((eval):256)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:281)", "RUBY.filter_func((eval):187)", "RUBY.filter_batch(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:370)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:281)", "RUBY.each(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:224)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1342)", "RUBY.each(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:223)", "RUBY.filter_batch(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:369)", "RUBY.worker_loop(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:350)", "RUBY.start_workers(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:317)", "java.lang.Thread.run(java/lang/Thread.java:748)"]}

"Invalid attempt to open a GeoLite2-ASN database using the city method"

I noticed that it keeps calling the .city method and that leads to the error. Do you know how I can solve this?

Ahh! This was a bug we fixed about 2 months ago.

You have two options for getting this bug fix:

  1. Upgrade Logstash to v5.4.2 or v5.5.0 (whichever is your preference). Both of these versions include the fix I describe.

  2. Alternatively, upgrade just the geoip filter: bin/logstash-plugin update logstash-filter-geoip. Your Logstash includes geoip filter v4.0.4, and the fix is available in v4.1.1 and later.

Once you upgrade (either Logstash or just the geoip filter), your config should work.

Thanks! I updated the filter and the error no longer appeared. The logs show that it is using the database:

[2017-07-22T21:54:22,911][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/etc/logstash/dictionaries/GeoLite2-ASN_20170718/GeoLite2-ASN.mmdb"}
[2017-07-22T21:54:22,933][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/etc/logstash/dictionaries/GeoLite2-ASN_20170718/GeoLite2-ASN.mmdb"}

However, when I look in Kibana, every record has the _geoip_lookup_failure tag. To see whether the problem was the database file I was using or Logstash itself, I queried the same source and destination IPs using the Python maxminddb module, and that query returned a valid result object. Do you know what is happening in the lookup process that is causing this?

I got it figured out! I messed up my source field. Thank you for your help!
