Geoip field is not showing in dashboard


(Shubhrant Chauhan) #1

I'm trying to get the geographic location of clients hitting my Apache server, but the geoip field is not showing up.

This is my Logstash configuration file:

input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "apache" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      source => "clientip"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

and a raw log line looks like this:

- - - [11/Jan/2017:16:30:41 +0530] "GET /report_generator_hartron/index.php HTTP/1.1" 200 596 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/55.0.2883.87 Chrome/55.0.2883.87 Safari/537.36"

One more thing: when I added the geoip filter to the Logstash configuration, an error tag appeared on the events, namely "_geoip_lookup_failure".


(Magnus Bäck) #2

The geoip filter fails because there is no clientip field. There's no clientip field because the grok filter fails. The grok filter fails because

  • the text example you gave contains no client IP (the first field is just "-"), and
  • in the screenshot the log clearly isn't in Combined format.

(Shubhrant Chauhan) #3

Sorry for the late reply.

I've now made some changes in the Apache configuration file and the client IP field is coming through; you can check the screenshot.

But I still have the same problem with geoip.

The log message now looks like:

xxx.xxx.xxx.77 - - [12/Jan/2017:10:36:39 +0530] "GET /report_generator_hartron/pagination.js HTTP/1.1" 404 532 "http://xx.xx.xx.xx/report_generator_hartron/showreport.php" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/55.0.2883.87 Chrome/55.0.2883.87 Safari/537.36"


(Magnus Bäck) #4

GeoIP lookups don't work for RFC1918 addresses like 10.228.12.77.
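As an aside (not part of the thread's tooling), Python's standard ipaddress module illustrates the point: RFC 1918 addresses are flagged as private, and public GeoIP databases simply have no entries for them.

```python
import ipaddress

# RFC 1918 reserves 10.0.0.0/8, 172.16.0.0/12, and 192.168.0.0/16 for
# private networks; public GeoIP databases carry no entries for these,
# so lookups on such addresses fail.
for addr in ["10.228.12.77", "192.168.1.5", "46.119.114.245"]:
    ip = ipaddress.ip_address(addr)
    print(addr, "private" if ip.is_private else "public")
```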


(Shubhrant Chauhan) #5

So what should I do about this, sir?


(Magnus Bäck) #6

Perhaps you can use the translate filter to map your 10.0.0.0/8 addresses to geographic locations? If you don't want to list all possible addresses then the cidr filter should be helpful. A combination of the two might be the best option.

You obviously need to obtain and maintain the mapping between your 10.0.0.0/8 addresses and geolocations. Nobody can help you with that.
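A minimal sketch of that combination, assuming you maintain the mapping yourself (the subnet, site name, and coordinates below are invented placeholders, not values from this thread):

```
filter {
  # Tag events whose client IP falls inside a known internal subnet.
  cidr {
    address   => ["%{clientip}"]
    network   => ["10.1.0.0/16"]                  # placeholder subnet
    add_field => { "site" => "office-delhi" }     # placeholder site name
  }
  # Map the site name to coordinates you maintain yourself.
  translate {
    field       => "site"
    destination => "site_location"
    dictionary  => { "office-delhi" => "28.61,77.21" }  # placeholder coords
  }
}
```

With one cidr block per subnet, the translate dictionary only needs one entry per site rather than one per address.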


(Shubhrant Chauhan) #7

Ok.

I fed in a sample Apache log file that has public IP addresses like 46.119.114.245, but the geoip field is still not appearing.

And this is the Logstash log message, if you need it:

[2017-01-12T10:31:24,619][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}


(Magnus Bäck) #8

I don't know what's up in your case. You're not getting a _geoip_lookup_failure tag so it looks like it's succeeding. Also, it works fine for me with Logstash 2.4.0:

$ echo '46.119.114.245' | /opt/logstash/bin/logstash -e 'input { stdin {} } filter { geoip { source => "message" } } output { stdout { codec => rubydebug } }'
Settings: Default pipeline workers: 8
Pipeline main started
{
       "message" => "46.119.114.245",
      "@version" => "1",
    "@timestamp" => "2017-01-12T07:15:02.731Z",
          "host" => "lnxolofon",
         "geoip" => {
                    "ip" => "46.119.114.245",
         "country_code2" => "UA",
         "country_code3" => "UKR",
          "country_name" => "Ukraine",
        "continent_code" => "EU",
              "latitude" => 49.0,
             "longitude" => 32.0,
              "location" => [
            [0] 32.0,
            [1] 49.0
        ]
    }
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}

And this is the logstash log message if you need it

That message is completely unrelated.


(Shubhrant Chauhan) #9

I got the same output with the above command.

  • Is it possible to create my own file of internal IP addresses for Logstash to use for the geoip lookup?

(Magnus Bäck) #10

I got the same output with the above command.

Then there's something else in your configuration that removes the geoip field. Start by commenting out your elasticsearch output and use a simple stdout { codec => rubydebug } output. Does that make a difference?

Clearly your Logstash is capable of looking up the IP address in question so it shouldn't be hard to narrow down the problem.

Is it possible to create my own file of internal IP addresses for Logstash to use for the geoip lookup?

The geoip filter supports custom GeoIP databases, so I suppose you should be able to create your own such database with your internal addresses. I don't know the details.
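The geoip filter's database option points it at an alternative database file; a sketch, assuming you have built a custom database with MaxMind-compatible tooling (the path below is a placeholder):

```
filter {
  geoip {
    source   => "clientip"
    # Placeholder path; building a database that covers internal
    # addresses requires MaxMind-compatible tooling.
    database => "/etc/logstash/internal-geoip.dat"
  }
}
```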


(Shubhrant Chauhan) #11

I've changed my config to:

input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "apache" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      source => "clientip"
    }
  }
}

output {
  stdout { codec => rubydebug }
}

but the geoip field is still not showing.


(Magnus Bäck) #12

Continue with the simplification. What if you replace the beats input with a stdin input and pipe the same log to Logstash?
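A sketch of that test, assuming the config above is saved as apache.conf with the beats input swapped for stdin {} (file name and paths are examples only). Note that events read from stdin won't have type set to "apache", so the [type] == "apache" conditional should be removed, or a type added via add_field, for this test to exercise the filters:

```
# Pipe one of the problematic log lines straight into Logstash:
echo 'xxx.xxx.xxx.77 - - [12/Jan/2017:10:36:39 +0530] "GET /index.php HTTP/1.1" 200 596 "-" "Mozilla/5.0"' \
  | /opt/logstash/bin/logstash -f apache.conf
```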


(Shubhrant Chauhan) #13

Same; the geoip output is still not showing.

I'll try one thing: I'll put my Apache server on a public IP, then maybe it will show.

OK Magnus, thanks for your valuable time! :slight_smile:


(system) #14

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.