Logstash geoip filter not working properly for certain public IPs when pushing data to Elasticsearch

I am pushing AWS ELB logs to Elasticsearch using the config below:

    input {
      s3 {
        type => "elb-logs"
        bucket => "<bucket name>"
        region => "us-east-1"
        sincedb_path => "/var/lib/logstash/since.db"
      }
    }

    filter {
      if [type] == "elb-logs" {
        grok {
          match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb_name} %{IP:elb_client_ip}:%{INT:elb_client_port:int} (?:%{IP:elb_backend_ip}:%{NUMBER:elb_backend_port:int}|-) %{NUMBER:elb_request_processing_time:float} %{NUMBER:elb_backend_processing_time:float} %{NUMBER:elb_response_processing_time:float} (?:%{INT:elb_status_code:int}|-) (?:%{INT:elb_backend_status_code:int}|-) %{INT:elb_received_bytes:int} %{INT:elb_sent_bytes:int} \"(?:%{GREEDYDATA:elb_request}|-)\" \"(?:%{GREEDYDATA:elb_userAgent}|-)\" %{NOTSPACE:elb_sslcipher} %{NOTSPACE:elb_sslprotocol}"]
          match => ["message", "%{GREEDYDATA:elb_event_name} for ELB: %{NOTSPACE:elb_name} at %{TIMESTAMP_ISO8601:timestamp}"]
        }
        if [elb_request] =~ /.+/ {
          grok {
            match => ["elb_request", "(?:%{WORD:elb_http_method}) (?:%{DATA:elb_http_path})? (?:%{DATA:elb_http_type}/%{NUMBER:elb_http_version:float})?|%{GREEDYDATA:rawrequest}"]
          }
        }
        if [elb_http_path] =~ /.+/ {
          grok {
            match => ["elb_http_path", "(?:%{WORD:elb_http_path_protocol}://)?(%{NOTSPACE:elb_http_path_site}:)?(?:%{NUMBER:elb_http_path_port:int})?(?:%{GREEDYDATA:elb_http_path_url})?"]
          }
        }
        fingerprint {
          source => ["message"]
          target => "[@metadata][fingerprint]"
          method => "MURMUR3"
        }
        geoip {
          source => "elb_client_ip"
          fields => ["city_name", "continent_code", "country_code2", "country_code3", "country_name", "dma_code", "postal_code", "region_code", "region_name", "location", "latitude", "longitude", "timezone"]
        }
        useragent {
          source => "elb_userAgent"
        }
      }
    }
    output {
      if [type] == "elb-logs" {
        elasticsearch {
          hosts => ["<host>:port"]
          ilm_enabled => false
          index => "aws-elblogs-%{+YYYY.MM.dd}"
          document_id => "%{[@metadata][fingerprint]}"
        }
      }
    }

All of the data is pushed to Elasticsearch with the correct mapping, but for some IPs the geoip filter was not returning the proper latitude and longitude, and it was also not adding the following fields:

city_name 
continent_code
country_code2
country_code3
country_name
dma_code
region_code
region_name

Example log:

    2019-11-23T10:30:10.103697Z <ELBname> 2.16.106.29:44924 172.18.10.94:31700 0.000045 0.015174 0.000044 200 200 0 1893 "GET  <domain> HTTP/1.1" "Pingdom.com_bot_version_1.4_(http://www.pingdom.com/)" ECDHE-RSA-AES128-GCM-SHA256 TLSv1.2

For the above log (client IP 2.16.106.29), the filter returned latitude 42 and longitude 8.

It only happens for one range of IPs, 2.*.*.*. All other documents are fine.

I tried debug mode using a stdout output; the result there looked fine, but the data in Elasticsearch is not correct.
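
For reference, the debug run was just the same pipeline with a stdout output using the rubydebug codec added, roughly like this (a minimal sketch, not the full config):

    output {
      stdout { codec => rubydebug }
    }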

Any help here?

Another example in debug mode:

    [2020-03-26T19:32:07,537][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@version"=>"1", "message"=>"2019-11-11T21:18:39.394297Z <elbname> 2.18.240.101:51393 172.18.4.102:30207 0.000043 0.003012 0.000026 200 200 0 0 \"HEAD <domain>/ HTTP/1.1\" \"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.85 Safari/537.36\" ECDHE-RSA-AES128-GCM-SHA256 TLSv1.2\n", "@timestamp"=>2020-03-26T19:32:06.822Z, "type"=>"elb-logs"}}
    [2020-03-26T19:32:07,634][DEBUG][logstash.pipeline        ] output received {"event"=>{"elb_status_code"=>200, "elb_http_version"=>1.1, "elb_http_method"=>"HEAD", "elb_http_path_port"=>443, "elb_client_ip"=>"2.18.240.101", "os"=>"Windows", "elb_backend_ip"=>"172.18.4.102", "elb_request_processing_time"=>4.3e-05, "patch"=>"2214", "elb_client_port"=>51393, "elb_sslprotocol"=>"TLSv1.2", "elb_http_path_protocol"=>"https", "elb_backend_port"=>30207, "os_name"=>"Windows", "build"=>"", "message"=>"2019-11-11T21:18:39.394297Z <elbname> 2.18.240.101:51393 172.18.4.102:30207 0.000043 0.003012 0.000026 200 200 0 0 \"HEAD <domain> HTTP/1.1\" \"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.85 Safari/537.36\" ECDHE-RSA-AES128-GCM-SHA256 TLSv1.2\n", "elb_backend_processing_time"=>0.003012, "elb_sent_bytes"=>0, "elb_http_path_url"=>"/", "name"=>"Chrome", "elb_sslcipher"=>"ECDHE-RSA-AES128-GCM-SHA256", "minor"=>"0", "elb_name"=>"<elbname>", "elb_received_bytes"=>0, "geoip"=>{"timezone"=>"Europe/Vaduz", "latitude"=>47.0, "continent_code"=>"EU", "location"=>{"lat"=>47.0, "lon"=>8.0}, "longitude"=>8.0}, "type"=>"elb-logs", "elb_http_type"=>"HTTP", "elb_response_processing_time"=>2.6e-05, "elb_http_path_site"=>"<site>", "@timestamp"=>2020-03-26T19:32:06.822Z, "elb_http_path"=>"<domain>", "@version"=>"1", "timestamp"=>"2019-11-11T21:18:39.394297Z", "elb_userAgent"=>"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.85 Safari/537.36", "device"=>"Other", "elb_request"=>"HEAD <domainname> HTTP/1.1", "major"=>"40", "elb_backend_status_code"=>200}}

It is resolved. There was some issue with the default database. I manually downloaded the GeoLite2 City database and pointed the geoip filter at it:

    database => "/etc/logstash/GeoLite2-City.mmdb"
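
So the geoip filter now looks roughly like this (the database path is simply where I placed the downloaded GeoLite2-City.mmdb; adjust it to your own location):

    geoip {
      source => "elb_client_ip"
      database => "/etc/logstash/GeoLite2-City.mmdb"
      fields => ["city_name", "continent_code", "country_code2", "country_code3", "country_name", "dma_code", "postal_code", "region_code", "region_name", "location", "latitude", "longitude", "timezone"]
    }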

It worked. :slight_smile:
