Problem converting latitude and longitude into a geo point for Kibana

I am having difficulty getting Logstash (2.2.4) to convert two parsed values, longitude and latitude, into a geo point for Kibana. The Elasticsearch documentation presents an example for geo points where you create some sort of configuration file (https://www.elastic.co/guide/en/elasticsearch/guide/current/geopoints.html). I do not know:

  1. Where does it go?
  2. Is it overriding the elasticsearch template used by Logstash?
  3. Is there a pre-existing filter plugin for Logstash that can take a latitude and longitude and produce a geo point? The only one I am aware of is geoip, but geoip assumes the IP is fixed to a set latitude and longitude. In my case the IP is mobile, with a new position every few seconds.

My Logstash configuration is as follows:

input {
  file {
    path => "/opt/project/mylog.txt"
  }
}

filter {
  grok {
    add_tag => [ "project", "message1" ]
    match => { "message" => "%{DATE_US:date} %{TIME:time} message1: name:%{DATA:name}, lat:%{DATA:latitude}, lon:%{DATA:longitude}, alt:%{DATA:altitude}" }
  }

  grok {
    add_tag => [ "project", "meessage2" ]
    match => { "message" => "%{DATE_US:date} %{TIME:time} meessage2: name:%{DATA:name}, lat:%{DATA:latitude}, lon:%{DATA:longitude}, alt:%{DATA:altitude}, delta:%{DATA:delta}, status:%{WORD:status}" }
  }

 if "message1" in [tags]
 {
   mutate {
     add_field => { "[location][lat]" => "%{latitude}"
                    "[location][lon]" => "%{longitude}"
     }
   }

   mutate {
      convert => {
        "[location][lat]" => "float"
        "[location][lon]" => "float"
     }
   }
 }

  mutate {
    convert => {
      "latitude" => "float"
      "longitude" => "float"
      "altitude" => "float"
      "delta" => "float"
    }
  }
}

output {
    if "message1" in [tags] and "project" in [tags]
    {
      elasticsearch
      {
             index => "project-%{+YYYY.MM.dd}"
             manage_template => "false"
             template => "/etc/logstash/templates/project-elasticsearch.json"
      }

      file {
         path => "/opt/project/message1.txt"
      }
    }
    else if "message2" in [tags] and "project" in [tags]
    {
      elasticsearch
      {
             index => "project-%{+YYYY.MM.dd}"
      }

      file {
         path => "/opt/project/message2.txt"
      }

    }
}

Now here is the template I copied from the Logstash elasticsearch output plugin and modified to try to tell Logstash that the "location" object should be treated as a geo point.

{
  "template" : "thunderstorm-*",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true, "omit_norms" : true},
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" }
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" },
            "fields" : {
              "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
            }
          }
        }
      } ],
      "properties" : {
        "@timestamp": { "type": "date" },
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip"  : {
          "dynamic": true,
          "properties" : {
            "ip": { "type": "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "float" },
            "longitude" : { "type" : "float" }
          }
        },
	"location": {
	    "type": "geo_point"
        }
      }
    }
  }
}

I would appreciate any insight into this problem, as well as any suggestions for improvement, since I am relatively new to Logstash/Elasticsearch/Kibana.

You need a single field with lat and lon in it, not a nested field, e.g.:

mutate {
  add_field => [ "[geoip][location]", "%{longitude}" ]
  add_field => [ "[geoip][location]", "%{latitude}" ]
}
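
If the two add_field calls give you trouble, a geo_point can also be written as a single "lat,lon" string, so something along these lines should amount to the same thing (just a sketch; latitude and longitude are the fields parsed by your grok, and the target field still needs a geo_point mapping):

mutate {
  # one comma-separated "lat,lon" string, which Elasticsearch also accepts
  # for a field mapped as geo_point
  add_field => { "[geoip][location]" => "%{latitude},%{longitude}" }
}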

I got an error about duplicate keys when using the suggestion exactly as written. I think what you are expressing is not exact syntax but merely representative of a single field for each value. I changed my configuration to:

if "position" in [tags]
{
   mutate {
     add_field => { "[geoip][latitude]" => "%{latitude}" }
     add_field => { "[geoip][longitude]" => "%{longitude}" }
   }

   mutate {
     convert => {
       "[geoip][latitude]" => "float"
       "[geoip][longitude]" => "float"
    }
   }
 }

Is this more in line with your suggestion?

Changing my configuration to what I posted today still does not make the latitude and longitude into a geo point when I view the data in Kibana.

Here is what Kibana sees for the JSON:

{
  "_index": "project-2016.06.16",
  "_type": "logs",
  "_id": "AVVagllAqFBA57QBrgLU",
  "_score": null,
  "_source": {
    "message": "06/16/2016 14:38:58 Position: name:SampleDFDevice, lat:34.06691243701163, lon:-81.15418291973535, alt:1000.0",
    "@version": "1",
    "@timestamp": "2016-06-16T18:38:59.299Z",
    "path": "/opt/system/mylog.txt",
    "host": "localhost.localdomain",
    "date": "06/16/2016",
    "time": "14:38:58",
    "name": "SampleDFDevice",
    "latitude": 34.06691243701163,
    "longitude": -81.15418291973535,
    "tags": [
      "project",
      "position",
      "_grokparsefailure"
    ],
    "location": {
      "latitude": 34.06691243701163,
      "longitude": -81.15418291973535
    }
  },
  "fields": {
    "@timestamp": [
      1466102339299
    ]
  },
  "sort": [
    1466102339299
  ]
}

I do not see how location would be considered a geo point. Should there be another entry in location to make it a geo point?
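
For what it is worth, I believe the deciding factor is the mapping Elasticsearch applied to the index, which I assume can be checked with something like this (index name taken from the document above):

GET /project-2016.06.16/_mapping

If location does not come back there with "type": "geo_point", I take it Kibana will not treat it as one.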

Have a read of https://www.elastic.co/guide/en/elasticsearch/reference/2.3/geo-point.html

I see the mapping snippet that I can use. I put the following into /etc/elasticsearch/templates in a file called location.json:

{
    "location_mapping": {
        "my_type": {
            "properties": {
                "location": {
                    "type": "geo_point"
                }
            }
        }
    }
}
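
If a file under /etc/elasticsearch/templates is not the right mechanism, I assume the same mapping could also be pushed as an index template over the REST API, roughly like this (the template name and index pattern are my guesses):

PUT _template/location_mapping
{
    "template": "project-*",
    "mappings": {
        "my_type": {
            "properties": {
                "location": {
                    "type": "geo_point"
                }
            }
        }
    }
}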

Is this the direction you were suggesting?

As per the docs, your naming is wrong:

Geo-point expressed as an object, with lat and lon keys.

"location": {
      "lat": 34.06691243701163,
      "lon": -81.15418291973535
    }

Mark,

I appreciate your patience and perseverance in helping me with this problem. I have re-read the geo point documentation page and see the line you quoted. It gives an example JSON that describes a geo point object as:

PUT my_index/my_type/1
{
  "text": "Geo-point as an object",
  "location": { 
    "lat": 41.12,
    "lon": -71.34
  }
}

I changed the Logstash configuration to read:

input { 
    file {
         path => "/opt/event/mylog.txt"
    }
}

filter {
    grok {
        add_tag => [ "myproject", "position" ]
        match => { "message" => "%{DATE_US:date} %{TIME:time} Position: name:%{DATA:name}, lat:%{DATA:latitude}, lon:%{DATA:longitude}, alt:%{DATA:altitude}" }
    }

    if "position" in [tags]
    {
       mutate {
         add_field => { "[location][lat]" => "%{latitude}" }
         add_field => { "[location][lon]" => "%{longitude}" }
       }

       mutate {
         convert => {
           "[location][lat]" => "float"
           "[location][lon]" => "float" 
        }
       }
   }

    mutate {
      convert => {
        "latitude" => "float"
        "longitude" => "float"
        "altitude" => "float"
        "delta" => "float"
      }
    }
}

output { 
       if "position" in [tags] and "myproject" in [tags] 
       {

         elasticsearch
         {
                index => "myproject-%{+YYYY.MM.dd}"
               manage_template => "false"
               template => "/etc/logstash/templates/myproject-elasticsearch.json"
         }

         file {
            path => "/opt/cyberquest/position.txt"
         }

       }
       else if "heartbeat" in [tags] and "thunderstorm" in [tags]
       {

         elasticsearch
         {
                index => "myproject-%{+YYYY.MM.dd}"
         }

         file {
            path => "/opt/cyberquest/heartbeat.txt"
         }

       }
}

Which resulted in the following JSON object for a position:

{
  "_index": "myproject-2016.06.20",
  "_type": "logs",
  "_id": "AVVt60EVrNq0NqkairlO",
  "_score": null,
  "_source": {
    "message": "06/20/2016 09:06:21 Position: name:SampleDFDevice, lat:34.11946855074746, lon:-81.92403913044855, alt:1000.0",
    "@version": "1",
    "@timestamp": "2016-06-20T13:06:21.507Z",
    "path": "/opt/event/mylog.txt",
    "host": "localhost.localdomain",
    "date": "06/20/2016",
    "time": "09:06:21",
    "name": "SampleDFDevice",
    "latitude": 34.11946855074746,
    "longitude": -81.92403913044855,
    "tags": [
      "myproject",
      "position",
      "_grokparsefailure"
    ],
    "location": {
      "lat": 34.11946855074746,
      "lon": -81.92403913044855
    }
  },
  "fields": {
    "@timestamp": [
      1466427981507
    ]
  },
  "sort": [
    1466427981507
  ]
}

using the Elasticsearch template of:

    {
        "location_mapping": {
            "my_type": {
                "properties": {
                    "location": {
                        "type": "geo_point",
                        "geohash_prefix": true
                    }
                }
            }
        }
    }

When I go into Kibana and do:

  1. Click on Visualize
  2. Select the myproject from the search source
  3. Click on Geo Coordinates
  4. Select Geo Coordinates as the bucket type

I see the Aggregation set to "Geohash", and below that I see an error message:

No Compatible Fields: The "myproject-*" index pattern does not contain any of the following field types: geo_point.

This is why I have been posting here. I want to be able to plot the geo points on a map.

Right, I am not sure my template is being loaded and matching effectively. How do you double-check that a template is properly formatted and that it is matching?
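
I assume something along these lines would show what Elasticsearch actually has registered and what mapping ended up on the index (names taken from my setup above), but I have not confirmed this is the right way to verify it:

GET /_template
GET /myproject-2016.06.20/_mapping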

Is it the template above?

No. After re-reading the Logstash documentation and a forum thread (Add Geopoint based off of parsed value to logstash config), I copied the Logstash elasticsearch output plugin template and attempted to modify it for my needs.

Here is my template:

{
  "template" : "myproject-*",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true, "omit_norms" : true},
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" }
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" },
            "fields" : {
              "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
            }
          }
        }
      } ],
      "properties" : {
        "@timestamp": { "type": "date" },
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip"  : {
          "dynamic": true,
          "properties" : {
            "ip": { "type": "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "float" },
            "longitude" : { "type" : "float" }
          }
        },
	"location" : { "type": "geo_point" }
      }
    }
  }
}
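
One thing I am unsure of: as far as I understand, a template is only applied at index creation time, so even with a correct template I may have to delete the current daily index (or wait for the next day's index to be created) before the geo_point mapping shows up, e.g.:

DELETE /myproject-2016.06.20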

When I look at the geo point documentation (https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html), I see that the JSON for setting a geo point appears different from what I see in my output. I see:

"location": {
      "lat": 33.48414869262067,
      "lon": -81.8800662624854
    }

whereas the examples seem to have extra curly braces:

{
"location": {
      "lat": 33.48414869262067,
      "lon": -81.8800662624854
    }
} 

Do the extra braces matter?

Did you succeed in getting this to work?
I'm about to try my hand at converting lat/lon information into geo_point for Kibana visualization.

We got something to work but I don't recall what we did. It has been two months since I looked at it.

Thanks for checking back in.
I did get it to work.
It works for me like this:

PUT /alex-locations-a/v1/1
{
  "text" : "a point",
  "@timestamp" : 1476415349950,
  "na-location" : {
    "lat" : 41.12,
    "lon" : -71.34
  }
}

PUT /alex-locations-a/v1/2
{
  "text" : "another point",
  "@timestamp" : 1476415349951,
  "na-location" : {
    "lat" : 42.12,
    "lon" : -71.34
  }
}

with this mapping:
PUT _template/alex-locations
{
  "template": "alex-locations*",
  "settings": {},
  "mappings": {
    "_default_": {
      "properties": {
        "na-location": {
          "type": "geo_point"
        },
        "@timestamp": {
          "format": "strict_date_optional_time||epoch_millis",
          "type": "date"
        }
      }
    }
  }
}
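
If you want to confirm the mapping took effect, the mapping can be fetched back (index name from the example above):

GET /alex-locations-a/_mapping

na-location should come back with "type": "geo_point".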

@iamthealex I tried out your example in the Kibana Console and it works like a charm.
Thanks to your post, I now understand it completely. Many thanks! :grin: