Private networks with GeoIP


(Marcelo) #1

Continuing the discussion from Creating geoip data for internal networks:

I've been looking for a way to get GeoIP data for my private networks.

I tried to edit the GeoLite2-City.mmdb file to include my networks in it, but I did not succeed; editing this file seemed very complex, and I gave up after a week of trying.

I was able to create a .dat file with my networks by following the procedures defined in Customizing Maxmind IP Geo DB for Internal Networks, but I could not make Logstash work with my custom file, because the default format for GeoIP2 is now .mmdb, no longer .dat.

I read the topic Create Custom geoip database for Logstash 5.2 and saw the possibility of attaching a .csv file with my networks. I created a yml file in the pattern mentioned in that topic.

My log:

www.mydomain.com 192.168.254.146 "http://sapllogstash02.tjpe.gov.br:8080" - - [01/Jul/2018:16:41:44 -0300] "GET /kibana/app/kibana HTTP/1.1" 200 8202 "https://www.mydomain.com/kibana/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:60.0) Gecko/20100101 Firefox/60.0" 127265181 512 456

My config /etc/logstash/conf.d/config.yml:

input { stdin {} }
filter {
  if "apache_error_log" in [tags] {
    grok {
      match => { "message" => '^%{HOSTNAME:VirtualHost} %{IPV4:clientip} "%{NOTSPACE:balancer_worker_name}" - - \[%{HTTPDATE:timestamp}\] "(?:((%{NOTSPACE:Method} %{NOTSPACE:request})|(%{WORD:Method} %{DATA:request} HTTP/%{NOTSPACE:httpversion})|(-)))" %{NOTSPACE:response} (?:%{NUMBER:ResponseSize:int}|-) %{QUOTEDSTRING:referrer} %{QUOTEDSTRING:agent} (?:%{NUMBER:TimeTaken:int}|-) (?:%{NUMBER:BytesReceived:int}|-) (?:%{NUMBER:BytesSents:int}|-)' }
    }
    translate {
      regex => true
      dictionary_path => "/etc/logstash/mutate/jsontranslate.yml"
      field => "clientip"
    }
    json {
      source => "translation"
    }
    date {
      match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}

My dictionary yml, /etc/logstash/mutate/jsontranslate.yml:
'192\.168\.254\.*': '{"geoip": {"latitude": -8.0703 , "longitude": -34.8969 , "location": [ -34.8969 , -8.0703 ]}, "country_name": "Brazil", "country_code2": "BR" , "continent_code": "SA" , "region_name": "Pernambuco" , "timezone": "America/Recife" , "region_code": "PE", "country_code3": "BR", "city_name": "RECIFE"}'
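One thing I noticed afterwards: with regex => true the dictionary key is treated as a regular expression, and 192\.168\.254\.* actually means "254" followed by zero or more literal dots, not "any host in the subnet". A sketch of the dictionary with the pattern anchored instead (same data, just the key changed; untested against this exact setup):

```yaml
# /etc/logstash/mutate/jsontranslate.yml (sketch)
# Anchored regex: matches any client IP beginning with 192.168.254.
'^192\.168\.254\.': '{"geoip": {"latitude": -8.0703, "longitude": -34.8969, "location": [ -34.8969, -8.0703 ]}, "country_name": "Brazil", "country_code2": "BR", "continent_code": "SA", "region_name": "Pernambuco", "timezone": "America/Recife", "region_code": "PE", "country_code3": "BR", "city_name": "RECIFE"}'
```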

However, I could not make it work; I also tried adding the geoip filter, but had no success with this solution either.

Finally I read the topic Creating geoip data for internal networks that mentions a way to do via filter using mutate.

I created this filter:

if [clientip] =~ /^192\.168\.254\.*/ {
  mutate { replace => { "[geoip][timezone]" => "America/Recife" } }
  mutate { replace => { "[geoip][continent_code]" => "SA" } }
  mutate { replace => { "[geoip][country_name]" => "Brazil" } }
  mutate { replace => { "[geoip][region_code]" => "PE" } }
  mutate { replace => { "[geoip][country_code2]" => "BR" } }
  mutate { replace => { "[geoip][country_code3]" => "BR" } }
  mutate { replace => { "[geoip][region_name]" => "RECIFE" } }
  mutate { remove_field => [ "[geoip][location]" ] }
  mutate { add_field => { "[geoip][location]" => "-34,88" } }
  mutate { add_field => { "[geoip][location]" => "-8,05" } }
  mutate { convert => [ "[geoip][location]", "float" ] }
  mutate { replace => [ "[geoip][latitude]", "-8,05" ] }
  mutate { convert => [ "[geoip][latitude]", "float" ] }
  mutate { replace => [ "[geoip][longitude]", "-34,88" ] }
  mutate { convert => [ "[geoip][longitude]", "float" ] }
}
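Two details in that filter look suspicious to me in hindsight: the coordinates use commas as decimal separators (so convert to float cannot parse them), and location is built as an array of two strings. A sketch of the same filter with decimal points and location built as a lat/lon object (this assumes the index maps geoip.location as geo_point; not a tested, definitive version):

```text
if [clientip] =~ /^192\.168\.254\./ {
  mutate {
    replace => {
      "[geoip][timezone]"       => "America/Recife"
      "[geoip][continent_code]" => "SA"
      "[geoip][country_name]"   => "Brazil"
      "[geoip][country_code2]"  => "BR"
      "[geoip][country_code3]"  => "BR"
      "[geoip][region_code]"    => "PE"
      "[geoip][region_name]"    => "Pernambuco"
    }
  }
  # Rebuild location as an object a geo_point mapping can parse
  mutate { remove_field => [ "[geoip][location]" ] }
  mutate {
    add_field => {
      "[geoip][location][lat]" => "-8.05"
      "[geoip][location][lon]" => "-34.88"
    }
  }
  mutate {
    convert => {
      "[geoip][location][lat]" => "float"
      "[geoip][location][lon]" => "float"
    }
  }
}
```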

Besides not working, it was producing this message in the log:

[2018-07-01T16:52:39,355][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.07.01", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x5474fe7a>], :response=>{"index"=>{"_index"=>"filebeat-2018.07.01", "_type"=>"doc", "_id"=>"qp9nV2QBpG3tMCkdhA6l", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [geoip.location] tried to parse field [null] as object, but found a concrete value"}}}}

I've been trying to make this work for some time, but I'm not getting anywhere and I don't know where I'm going wrong. Is there a simple way to get GeoIP data for my internal networks?


(Mark Walkom) #2

I refer people to this thread; it should still work 🙂


(Marcelo) #3

I've read the topic could-not-index-event and the Stack Overflow topic logstash-geoip-location-mapping-to-geo-point-not-working, and I understood my problem.

My existing index already had a conflicting mapping for the location field; when I switched to a new index, the location field was populated with my data correctly.
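For anyone who hits the same mapper_parsing_exception: the key point is that geoip.location must be mapped as geo_point before any documents are indexed. A minimal sketch of such a mapping, in Dev Tools console syntax for the 6.x era this thread is from (the index name is just an example):

```json
PUT my-custom-index
{
  "mappings": {
    "doc": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}
```

An existing index cannot have this mapping changed in place, which is why switching to a fresh index (or reindexing) was needed.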

Thanks.


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.