GeoIP not working while taking input from lumberjack

Hi,

When I send nginx logs using logstash-forwarder, I get the following in response:

```
172.10.6.182 - - [22/Feb/2016:02:06:53 -0500] "GET /analytics.js HTTP/1.1" 301 184 "http://www.hjhjhjkk.com/searchResult" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 Safari/537.36"
```

```
{
        "message" => "10.101.3.3 - - [22/Feb/2016:08:10:33 -0500] \"GET /analytics.js HTTP/1.1\" 200 53192 \"http://www.hjhjhjkk.com/property/\" \"Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; Touch; TNJB; rv:11.0) like Gecko\" \"-\"",
       "@version" => "1",
     "@timestamp" => "2016-02-22T13:10:36.310Z",
           "type" => "nginx_access.log",
           "file" => "/var/log/nginx/access.log",
           "host" => "xxxxxx",
         "offset" => "1048192",
       "clientip" => "10.101.3.3",
          "ident" => "-",
           "auth" => "-",
      "timestamp" => "22/Feb/2016:08:10:33 -0500",
           "verb" => "GET",
        "request" => "/analytics.js",
    "httpversion" => "1.1",
       "response" => "200",
          "bytes" => "53192",
       "referrer" => "http://www.hjhjhjkk.com/property/",
          "agent" => "\"Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; Touch; TNJB; rv:11.0) like Gecko\""
}
```

When I take input from stdin,

I get some extra fields, which are the latitude and longitude derived from the GeoIP database:

```
"geoip" => {
               "ip" => "10.10.10.10",
    "country_code2" => "US",
    "country_code3" => "USA",
     "country_name" => "United States",
   "continent_code" => "NA",
         "latitude" => 38.0,
        "longitude" => -97.0,
         "dma_code" => 0,
        "area_code" => 0,
         "location" => [
        [0] -97.0,
        [1] 38.0
    ],
      "coordinates" => [
        [0] -97.0,
        [1] 38.0
    ]
}
```

Any idea why?
In the first case I don't see anything on the graph, but it works fine with stdin input for the same log events.

My conf file:

```
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  stdin {}
}

filter {
  if [type] == "nginx_access.log" {
    grok {
      match => { "message" => "%{NGINXACCESS}" }
    }
    geoip {
      source => "clientip"
      target => "geoip"
      database => "/etc/logstash/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch { hosts => "xxxxxxxx:9200" }
}
```

I'm a bit surprised that you're getting anything useful from a GeoIP lookup of 10.10.10.10 since the whole 10.0.0.0/8 network isn't routable on the Internet and no addresses in that range have a geolocation. I can only assume that the geoip filter doesn't get a match for, in this case, 10.101.3.3 and therefore doesn't add any fields.
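One way to confirm this is to tag events whose client IP falls in a private range, using the cidr filter. A minimal sketch (the `internal_ip` tag name is just an example; the networks listed are the RFC 1918 private blocks):

```
filter {
  # Tag events whose clientip is in an RFC 1918 private range, so it's
  # obvious in the rubydebug output why geoip didn't add any fields.
  cidr {
    address => [ "%{clientip}" ]
    network => [ "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16" ]
    add_tag => [ "internal_ip" ]
  }
}
```

You can then condition on `"internal_ip" in [tags]` to handle those events differently.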

Check out "Creating geoip data for internal networks" for an option for geoip on internal networks.
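The basic idea from that post is to fill in the coordinate fields yourself when the address is internal, since no public database can do it for you. A rough sketch (the latitude/longitude values here are made up; you'd use your site's real coordinates, and the pattern matches only the 10.0.0.0/8 range):

```
filter {
  if [clientip] =~ /^10\./ {
    # No public geolocation exists for this range, so populate the
    # geoip fields with known coordinates for the internal site.
    mutate {
      add_field => {
        "[geoip][latitude]"  => "38.0"
        "[geoip][longitude]" => "-97.0"
      }
    }
    # Separate mutate so the sprintf references above are resolvable.
    mutate {
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
  }
}
```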

Those IPs were changed just to be sure that I'm not exposing anything to the Internet. Sorry. :slight_smile:

Thanks. It seems I have to do something like that so that the internal network is also mapped.