Encountering _geoip_lookup_failure within the IP address range 104.16.16.0/20

Could someone clarify why my Logstash pipeline consistently tags events with "_geoip_lookup_failure" for IP addresses in the range 104.16.16.0/20, while the geoip filter works fine for other public IP addresses?

Hi @yogeshk04, welcome to the community!

I ran a quick lookup in Kibana Dev Tools, which uses the same database.

Looks like that range is simply not in the database.

These are the free GeoLite2 databases from MaxMind that are bundled with Logstash / Elasticsearch. If you need a more accurate one, you can purchase the commercial version and download it.
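As a side note, if you do purchase a commercial or custom MaxMind database, the Logstash geoip filter can be pointed at a local .mmdb file via its "database" option. A minimal sketch (the file path here is hypothetical):

```
filter {
  geoip {
    source   => "[ip]"
    target   => "geoip"
    # Hypothetical path to a purchased GeoIP2 City database
    database => "/etc/logstash/GeoIP2-City.mmdb"
  }
}
```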

POST _ingest/pipeline/geoip/_simulate
{
  "pipeline": {
    "processors": [
      {
        "geoip": {
          "field": "ip"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "ip": "8.8.8.8"
      }
    },
    {
      "_source": {
        "ip": "104.16.16.1"
      }
    }
  ]
}

# Results
{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_version": "-3",
        "_id": "_id",
        "_source": {
          "ip": "8.8.8.8",
          "geoip": {
            "continent_name": "North America",
            "country_name": "United States",
            "location": {
              "lon": -97.822,
              "lat": 37.751
            },
            "country_iso_code": "US"
          }
        },
        "_ingest": {
          "timestamp": "2024-03-07T18:00:03.282766764Z"
        }
      }
    },
    {
      "doc": {
        "_index": "_index",
        "_version": "-3",
        "_id": "_id",
        "_source": {
          "ip": "104.16.16.1"
        },
        "_ingest": {
          "timestamp": "2024-03-07T18:00:03.28278427Z"
        }
      }
    }
  ]
}

Hello @stephenb ,

Thank you very much. I'm quite new to ELK.
I performed a POST using the following:
POST _ingest/pipeline/_simulate
{ }
and received a 200 response, but I'm still encountering the "_geoip_lookup_failure" tags.
Is there something I might be overlooking? You mention "POST _ingest/pipeline/geoip/_simulate", but I get an error with that endpoint.

Thank you.

What is your Logstash configuration? Can you share the part where you are using the geoip filter?

Which IP is failing? Is it one specific IP address or every IP in that range?

I tested here and it is working, both the City and ASN databases.

[2024-03-07T18:00:40,154][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
{
       "message" => "104.16.16.1",
         "geoip" => {
          "country_name" => "United States",
                    "ip" => "104.16.16.1",
         "country_code2" => "US",
                "as_org" => "Cloudflare, Inc.",
              "latitude" => 37.751,
              "timezone" => "America/Chicago",
             "longitude" => -97.822,
        "continent_code" => "NA",
         "country_code3" => "US",
                   "asn" => 13335,
              "location" => {
            "lon" => -97.822,
            "lat" => 37.751
        }
    },
    "@timestamp" => 2024-03-07T21:00:40.151899693Z
}
[2024-03-07T18:00:41,124][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
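For context, a minimal test pipeline along these lines (a hypothetical reconstruction, not necessarily the exact config used) produces output like the above, with one geoip filter per database:

```
input { generator { message => "104.16.16.1" count => 1 } }
filter {
  # City database (default)
  geoip { source => "message" target => "geoip" }
  # ASN database, merged into the same target
  geoip { source => "message" target => "geoip" default_database_type => "ASN" }
}
output { stdout { codec => rubydebug } }
```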


@leandrojmp Huh, interesting: it works in Logstash but not in Elasticsearch. I thought they used the same DB...

Hmm, the docs say it's the same DB:

Geoip filter plugin
This plugin is bundled with GeoLite2 City database out of the box. From MaxMind’s description — "GeoLite2 databases are free IP geolocation databases comparable to, but less accurate than, MaxMind’s GeoIP2 databases". Please see GeoIP Lite2 license for more details.

@yogeshk04 Sorry to confuse you; you need to be in Kibana Dev Tools to run the command I showed. I ran the GeoIP lookup against the database that is loaded into Elasticsearch. I thought it was the same as Logstash's... apparently that is not the case, even though our documentation implies it.

That's weird; the documentation mentions that they use the same database, and Logstash downloads the databases directly from Elastic.

The default endpoint to download the geoip database is the one below.

# X-Pack GeoIP Database Management
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-geoip.html#plugins-filters-geoip-manage_update
#xpack.geoip.downloader.enabled: true
#xpack.geoip.downloader.endpoint: "https://geoip.elastic.co/v1/database"

Which is exactly the same one used by Elasticsearch.

Elasticsearch automatically downloads updates for these databases from the Elastic GeoIP endpoint: https://geoip.elastic.co/v1/database.

Maybe a bug in Elasticsearch? I tested on 8.12.1

But also, the OP's error seems to be in Logstash, not in Elasticsearch.

Yes, it is Logstash... but in the past I could always do the quick test and get the same results...

If I try to use the offline (non-EULA) database I can look up 104.16.16.1. That said, I think there is at least one bug in offline usage; I will verify tomorrow.

I was initially able to do the lookups with the online database.

But then I forced a download of a new DB and now I consistently cannot look up 104.16.16.1. When I do the lookup at MaxMind itself I get no location info, so it looks like they took it out of the DB. They know it is from Cloudflare, but not where it is.

I checked the Do Not Sell My Info exclusions listed at MaxMind and this subnet does not currently show up.

I used to do geoip lookups on hits to a very widely used payment processor, and every day we got a huge number of hits from all over the planet. When I mapped those results in Kibana there was an obvious spike at the corporate headquarters of the CDN that fronted our websites. In some cases they were shielding us from our clients' IP addresses, so the geoip data for those hits was content-free. It made the mapping data worse, not better. (There was another spike in northern Europe that I thought might be EUNIC, but I could not afford to spend the time to prove that.)

The free data itself is not that good; a lot of it is warmed-over whois data, which can often be located to the corporate headquarters of a small company that owned a /16 before it got rolled up by a big cable company.


Hi @stephenb,

Thank you so much for your reply.
Yes, I'm using Dev Tools. I tried this solution, but instead of "POST _ingest/pipeline/geoip/_simulate" (which gives me an error) I got a successful result using "POST _ingest/pipeline/_simulate". The output is the same as the one in your reply, but it is still not adding geoip for the IP range 104.16.16.0/20.
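For reference, the two simulate endpoints are not interchangeable: with an inline pipeline definition in the request body, the path takes no pipeline name, while "POST _ingest/pipeline/geoip/_simulate" expects a stored pipeline with the id "geoip" to already exist, which would explain the error:

```
# Inline pipeline definition: no pipeline name in the path
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [ { "geoip": { "field": "ip" } } ]
  },
  "docs": [ { "_source": { "ip": "8.8.8.8" } } ]
}

# Stored pipeline: the name goes in the path, and no "pipeline" object in the body
POST _ingest/pipeline/geoip/_simulate
{
  "docs": [ { "_source": { "ip": "8.8.8.8" } } ]
}
```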

Hello @Badger,

Thank you so much for your reply.

Hello @leandrojmp,

Thank you for your reply.

I am using the geoip filter plugin with the following if statement inside my filter block.

if [ip] {
  if [ip] == "127.0.0.1" or ![ip] {
    ruby {
      code => 'event.set("[ip]", nil)'
    }
  } else if !([ip] =~ /^10\.|^172\.(1[6-9]|2[0-9]|3[0-1])\.|^192\.168\./) {
    geoip {
      source => "[ip]"
      target => "geoip"
    }
  }
}

Please let me know if you need any other information. Thank you.
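As an aside, an alternative to the private-range regex above is the cidr filter plugin, which matches the reserved ranges directly (the "private_ip" tag name here is arbitrary):

```
filter {
  cidr {
    address => [ "%{ip}" ]
    network => [ "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16", "127.0.0.0/8" ]
    add_tag => [ "private_ip" ]
  }
  if "private_ip" not in [tags] {
    geoip {
      source => "[ip]"
      target => "geoip"
    }
  }
}
```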

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.