Don't know how to make Elasticsearch recognise the location coordinates properly for a Map visualization in Kibana

Hi,

Recently, I was given a project where I have to plot the different issues we have on a map.

I've been trying to do it with Logstash for a week, but I can't figure out why it isn't working...

Here is my ".conf" file:

input {
  file {
  path => "/opt/test05.csv"
  sincedb_path => "/dev/null"
  mode => "read"
  ignore_older => "29 d"
  file_completed_action => "delete"
  }
}

filter {
  grok {
    patterns_dir => ["/opt/paterns"]
    match => {
      "message" => "^%{DATA:errorid};%{USERNAME:errorcode};%{USERNAME:clientid};%{DATA:latlon:longitude};%{DATA:latlon:latitude};%{INT:connec_status};%{MINEDATE:erroruptime};%{IPV4:ip}$"
    }
  }
  geoip {
    source => "latlon"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
  }
  mutate {
    convert => {
      "[connec_status]" => "integer"
      "[latlon]" => "float"
      "[geoip][latitude]" => "float"
      "[geoip][longitude]" => "float"
      "[geoip][coordinates]" => "float"
    }
    copy => {
      "p05latlon" => "[geoip][coordinates]"
    }
  }
  date {
    match => [ "erroruptime", "dd/MM/YYYY HH:mm" ]
    locale => en
    remove_field => ["timestamp"]
  }
  if ("_grokparsefailure" in [tags]) {
    drop{}
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "test05"
  }
  stdout {}
}

The data I'm ingesting:

829616001458;CD0678-TR1;34678792401;63.607171; 10.801033;1;24/10/2019 9:50;1XX.2X.1XX.1XX
829616001468;S68254-TR2;34686339212;63.584253; 10.736047;1;24/10/2019 15:20;1XX.2X.1XX.5XX
829616001486;S53348-TR1;34686334689;63.789019; 9.709261;1;24/10/2019 14:44;1XX.2X.1XX.5X
829616001498;S23192-TR1;34662624163;63.839019; 9.939261;1;24/10/2019 9:31;1XX.2X.1XX.2XX

And in Kibana:

[screenshot]

(I covered or edited some sensitive data that has nothing to do with the error.)

The thing is, it says the location is in Saudi Arabia, which is not even close... and for some reason there is no field that Kibana's map visualization recognizes as a location.

I know this may look like a stupid issue, but since I couldn't solve it with what I found on the net, I'm asking here.

Thank you.

Hi @HikariNTB

You need to create a mapping with the geo_point data type before you index the data.

See an explanation here
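For example, something like this (a minimal sketch, assuming an index named test05 and that the coordinates end up in a field called location):

PUT test05
{
  "mappings": {
    "properties": {
      "location": {
        "type": "geo_point"
      }
    }
  }
}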


The quality of the geolocation data available for free is not very good.


Yes, thank you so much.

My main problem was thinking that I needed the "geoip" filter in my Logstash conf file, plus the fact that I hadn't created a mapping for this.

So, if anyone has the same problem where you need to geolocate some points and you have no IP related to them (like for a housing database or log), you just need to apply this mapping and then a conf file like this:

PUT test09
{
  "mappings": {
    "properties": {
      "location": {
        "type": "geo_point"
      }
    }
  }
}

input {
  file {
    path => "/opt/test09.csv"
    sincedb_path => "/dev/null"
    mode => "read"
    ignore_older => "29 d"
    file_completed_action => "delete"
  }
}

filter {
  grok {
    patterns_dir => ["/opt/paterns"]
    match => {
      "message" => "^%{DATA:errorid};%{USERNAME:errorcode};%{USERNAME:clientid};%{DATA:location};%{INT:connec_status};%{MINEDATE:erroruptime};%{IPV4:ip}$"
    }
  }
  date {
    match => [ "erroruptime", "dd/MM/YYYY HH:mm" ]
    locale => en
    remove_field => ["timestamp"]
  }
  if ("_grokparsefailure" in [tags]) {
    drop{}
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "test09"
  }
  stdout {}
}
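One caveat: when a geo_point is given as a string, Elasticsearch expects the "lat,lon" format. If your CSV separates latitude and longitude with "; " as in my sample data above, you may need to normalize the separator before indexing. A minimal sketch of a mutate filter that could go in the filter block right after the grok (assuming the field is named location, as above):

mutate {
  # Assumption: grok captured e.g. "63.607171; 10.801033" into [location];
  # rewrite the "; " separator to "," so Elasticsearch can parse the
  # string as a geo_point ("lat,lon").
  gsub => [ "location", "; ", "," ]
}

You can then confirm the field type with GET test09/_mapping.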

Hope this helps someone!
