Help with GeoLocation please!

Looking for some help please. I'm new to Elastic and I'm finding it very enjoyable!!

My ELK stack is running on a Windows server.

OK, so my geo situation is...

Filebeat on an IIS web server sending IIS logs to the Logstash server - this works great.

Logstash is parsing the logs and forwarding them to Elasticsearch - this also works great (my first time with grok).

Here is my logstash conf

input {
  beats {
    port => 5044
    type => "log"
  }
}

filter {

  # Ignore the comments that IIS will add to the start of the W3C logs
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    ## Very helpful site for building these statements:
    # This is configured to parse out every field of IIS's W3C format when
    # every field is included in the logs
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} %{WORD:serverName} %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
  }

  # Set the event timestamp from the log
  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UTC"
  }

  # If the log record has a value for 'bytesSent', then add a new field
  # to the event that converts it to kilobytes
  if [bytesSent] {
    ruby {
      code => "event['kilobytesSent'] = event['bytesSent'].to_i / 1024.0"
    }
  }

  # Do the same conversion for the bytes received value
  if [bytesReceived] {
    ruby {
      code => "event['kilobytesReceived'] = event['bytesReceived'].to_i / 1024.0"
    }
  }
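The conversion inside those ruby filters is just integer-to-float division. As a standalone sketch in plain Ruby (not the Logstash event API, and `to_kilobytes` is just an illustrative name), the logic is:

```ruby
# Convert a byte count, captured by grok as a string, to kilobytes.
# Dividing by 1024.0 (a float) avoids Ruby's integer division.
def to_kilobytes(bytes)
  bytes.to_i / 1024.0
end

puts to_kilobytes("2048")   # prints 2.0
```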

  # Perform some mutations on the records to prep them for Elastic
  mutate {
    ## Convert some fields from strings to integers
    convert => ["bytesSent", "integer"]
    convert => ["bytesReceived", "integer"]
    convert => ["timetaken", "integer"]

    ## Create a new field for the reverse DNS lookup below
    add_field => { "clientHostname" => "%{clientIP}" }

    ## Finally remove the original log_timestamp field since the event will
    #  have the proper date on it
    remove_field => [ "log_timestamp" ]
  }


  # Do a reverse lookup on the client IP to get their hostname
  dns {
    ## Now that we've copied the clientIP into a new field we can
    #  simply replace it here using a reverse lookup
    action => "replace"
    reverse => ["clientHostname"]
  }

  # Parse out the user agent (the grok pattern above names the field userAgent)
  useragent {
    source => "userAgent"
    prefix => "browser"
  }

  geoip {
    source => "clientIP"
    target => "geoip"
    add_tag => [ "iis-geoip" ]
    ## Build a [lon, lat] array, which is the order geo_point expects
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}
# We're only going to output these records to Elasticsearch, so configure
# the elasticsearch output
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Kibana is displaying the logs great: I can view them in Discover and make visualisations, and I'm also using some nice templates in the dashboard. All in all I'm very happy with how it's going.

But... I just can't get any geo goodness working!! I go to Visualise and tile map, select filebeat for the index, and for the bucket I select Geo_Point then client_location.

But there's no data to display in the tile map.
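One way to narrow this down is to check whether the indexed documents actually contain the coordinates field the config builds. A query along these lines (run from Kibana Dev Tools or curl; the field name assumes the geoip filter above) should return at least one hit if the pipeline is populating it:

```json
GET filebeat-*/_search
{
  "size": 1,
  "query": {
    "exists": { "field": "geoip.coordinates" }
  }
}
```

If this returns no hits, the problem is in the Logstash pipeline; if it returns hits, the problem is in the mapping or the visualisation.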

Here's a copy of the filebeat template.json that I applied against Elasticsearch:

"mappings": {
  "_default_": {
    "_all": {
      "enabled": true,
      "norms": {
        "enabled": false
      }
    },
    "dynamic_templates": [
      {
        "template1": {
          "mapping": {
            "doc_values": true,
            "ignore_above": 1024,
            "index": "not_analyzed",
            "type": "{dynamic_type}"
          },
          "match": "*"
        }
      }
    ],
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "client_location": {
        "type": "geo_point"
      },
      "message": {
        "type": "string",
        "index": "analyzed"
      },
      "offset": {
        "type": "long",
        "doc_values": "true"
      }
    }
  }
},
"settings": {
  "index.refresh_interval": "5s"
},
"template": "filebeat-*"

Any help is much appreciated.

You are targeting the geoip field in your filter, but you have client_location in the mapping.
You need to resolve that discrepancy.
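For example, if you want to keep the template's client_location name, one way to line the two up is a mutate rename after the geoip filter (a sketch, assuming the geoip config above with its [geoip][coordinates] field):

```
mutate {
  rename => { "[geoip][coordinates]" => "client_location" }
}
```

Alternatively, change the geo_point property in the template from client_location to geoip.coordinates and leave the filter as-is.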

Thanks Mark. So could I change the filter to client_location? Or should I change the template?

Either, or, whatever is easier for you really.

Not related to your question, but I find those filters perhaps unnecessary, and they may slow down your Logstash. For the bytesSent field, you can change the data format in Kibana so that it dynamically displays bytes, KB, or MB, so there is no need for a separate kilobytesSent field.

Plus, I don't think you need to convert string fields to integers using a Logstash plugin; instead, specify the data type in your ES mapping template.
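For instance, the numeric fields from the grok pattern could be typed directly in the template's properties section instead of converting them in Logstash (a sketch; the field names are taken from the grok pattern above):

```json
"properties": {
  "bytesSent":     { "type": "long" },
  "bytesReceived": { "type": "long" },
  "timetaken":     { "type": "long" }
}
```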

Excellent I'll give it a try tomorrow and let you know. Thanks.

Thanks ash. I'll give it a try.

Well, still no joy with the GeoIP. I repaired the discrepancies and rebuilt the Filebeat index.

Any other ideas guys?