Logstash Geoip Filter with a Custom Database

We are looking into using the geoip filter to look up IP addresses and blocks and return information that is not actually geographic. We've followed the instructions for creating a new .mmdb file (https://blog.maxmind.com/2015/09/29/building-your-own-mmdb-database-for-fun-and-profit/), and the new database works with the example script for searching it (i.e., with MaxMind::DB::Reader), but it does not work with the Logstash geoip filter.
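For reference, this is roughly the filter block we are testing with; the source field name and the database path below are just placeholders for our real ones:

```
filter {
  geoip {
    source   => "client_ip"                  # placeholder for the field holding the IP
    database => "/etc/logstash/custom.mmdb"  # placeholder path to our custom .mmdb
    target   => "geoip"
  }
}
```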

So far, we have determined that, to avoid fatal errors, we need to set the database's type to "GeoIP2-City" and use only field names that are found in the real City database (which we can translate into the field names we want later). But even with these changes, we are getting no results in the Logstash output, just {} and a _geoip_lookup_failure tag.
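(For the field-name translation mentioned above, the plan is simply to rename the City-schema fields after the lookup, along these lines; the destination field names here are made up:)

```
filter {
  mutate {
    # example renames only; the right-hand field names are placeholders
    rename => {
      "[geoip][city_name]"    => "[custom][label]"
      "[geoip][country_name]" => "[custom][owner]"
    }
  }
}
```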

Does anyone know if this can be done, or have ideas on what else we need to try?

Also, if we can get it to work, will we get longest-prefix matching out of it? The reason we are trying this instead of using a translate filter is that we have some records for individual IPs (/32) and some for IP blocks (e.g., /28) that may contain those individual addresses. If an IP matches more than one record, we want the record for the /32.

Thanks!

