Coordinate Map - No Data

Hi, I am trying to create a coordinate map based on a geo_point field. However, no locations are being shown on my map. To get to this point I first created an index with a mapping:

PUT cm_delivery_locations
{
  "settings" : {
    "index" : {
      "number_of_shards" : 1,
      "number_of_replicas" : 0
    }
  },
  "mappings" : {
    "properties" : {
      "delivery_postcode": { "type": "text" },
      "delivery_date": { "type": "date" },
      "delivery_location" : {
        "dynamic" : true,
        "properties" : {
          "location" : { "type" : "geo_point" },
          "latitude" : { "type" : "half_float" },
          "longitude" : { "type" : "half_float" }
        }
      }
    }
  },
  "aliases": { ".cm" : {} }
}
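As a sanity check, the applied mapping can be fetched back in Kibana Dev Tools to confirm that delivery_location.location really is a geo_point:

```
GET cm_delivery_locations/_mapping
```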

Then using Logstash I input the data. Here is a snippet of the Logstash config:

filter {
    mutate {
      add_field => { "[delivery_location][latitude]" => "%{[latlong][0]}" }
      add_field => { "[delivery_location][longitude]" => "%{[latlong][1]}" }
    }

    mutate {
      convert => { "[delivery_location][latitude]" => "float" }
      convert => { "[delivery_location][longitude]" => "float" }
    }
}

Here is what Logstash puts into ES:

{
    "delivery_date" => 2018-09-16T23:00:00.000Z,
    "delivery_postcode" => "XXX XXX",
    "delivery_location" => {
        "lat" => XX.XXXXXXXXXXX,
        "lon" => -X.XXXXXXXXXXX
    }
}

And the Logstash output:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cm_delivery_locations"
    manage_template => true
    template => "/etc/elasticsearch/templates/cm_delivery_locations.json"
    template_name => "cm_delivery_locations"
    template_overwrite => "true"
  }
  stdout { codec => rubydebug }
}

In Kibana I have created an index pattern, which shows 'delivery_location.location' as type geo_point. Discover shows the correct lat/long coordinates on search. Under visualisations, when I add a new visualisation, I am able to select 'delivery_location.location' as a Geohash (aggregation). However, when clicking the play button nothing shows on the map. Is there a way to see what is in the geo_point location field? Or have I missed something obvious?

Thanks,

Adam

@ads Can you click on the Inspect tab on the map and see what it shows there? It should show you some data points in a tabular format. If it does not, there is a Request tab that gives a snapshot of the query that ran and whether there were any errors.

[Screenshots attached (2020-03-07): the Inspect tab's data and request views]

When setting up your Kibana index pattern, did you set the Time Filter field name? What field did you specify? What time range is displayed in Kibana? Is the time filter excluding all documents? From the sample document provided, it looks like your timestamp is from 2018-09-16T23:00:00.000Z. Kibana defaults to last 15 minutes so I would suspect this may be the case of there not being any data displayed.

Also, why are you using a coordinate map? Have you tried the new Maps application? https://www.elastic.co/guide/en/kibana/7.6/maps.html

Great, that's helpful. The request is:

{
  "aggs": {
    "filter_agg": {
      "filter": {
        "geo_bounding_box": {
          "ignore_unmapped": true,
          "delivery_location.location": {
            "top_left": {
              "lat": 56.719829999999995,
              "lon": -12.293700000000001
            },
            "bottom_right": {
              "lat": 46.438790000000004,
              "lon": 10.77759
            }
          }
        }
      },
      "aggs": {
        "2": {
          "geohash_grid": {
            "field": "delivery_location.location",
            "precision": 4
          },
          "aggs": {
            "3": {
              "geo_centroid": {
                "field": "delivery_location.location"
              }
            }
          }
        }
      }
    }
  },
  "size": 0,
  "stored_fields": [
    "*"
  ],
  "script_fields": {},
  "docvalue_fields": [
    {
      "field": "@timestamp",
      "format": "date_time"
    },
    {
      "field": "delivery_date",
      "format": "date_time"
    }
  ],
  "_source": {
    "excludes": []
  },
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "match_all": {}
        },
        {
          "match_all": {}
        },
        {
          "range": {
            "delivery_date": {
              "gte": "2017-01-01T00:00:00.000Z",
              "lte": "2019-12-31T23:30:00.000Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  }
}

And the response is:

{
  "took": 2,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 772,
    "max_score": null,
    "hits": []
  },
  "aggregations": {
    "filter_agg": {
      "2": {
        "buckets": []
      },
      "doc_count": 0
    }
  }
}

If I'm interpreting this correctly, it's saying that there were 772 matches but no data in them?

I did try using the maps application, with the same results when adding a Grid Aggregation layer. Looking at it again, using inspect, my hits total is returning 0. The other thing is, the non-geoip examples I found online use coordinate maps, so I thought I would try that approach. Happy to take troubleshooting steps using the maps application instead.

Thanks,

What examples are you following online? Curious what resources are out there.

What does your elasticsearch mapping look like? Can you provide a sample document?
Can you try a document layer in the maps application? Do those show up?

I've used a range of resources to try and figure this out. The most useful has been:

I've not managed to find an example of someone creating a mapping for ES 7.x. The resources above mostly show mappings that use types and the _default_ parameter, which are now obsolete. It's likely that my google-fu is rusty, so apologies if anyone reading this has written an article using 7.x and I've just not managed to find it yet.

I have spent a long time looking at Logstash as I thought this is where the problem lay. Now that the latitude/longitude data is available in discover, I wonder whether the issue is related to the index mapping or my understanding of how to generate a geo_point from Logstash.

My Elasticsearch mapping, if I've understood correctly what a mapping is, is as per my first post:

{
  "settings" : {
    "index" : {
      "number_of_shards" : "1",
      "refresh_interval" : "5s"
    }
  },
  "index_patterns": [ "cm_delivery_locations" ],
  "mappings" : {
    "properties" : {
      "delivery_postcode": { "type": "text" },
      "delivery_date": { "type": "date" },
      "delivery_location" : {
        "dynamic" : true,
        "properties" : {
          "location" : { "type" : "geo_point" },
          "latitude" : { "type" : "half_float" },        
          "longitude" : { "type" : "half_float" }
        }
      }
    }
  }
}

If I perform a document layer in Maps, Inspect shows the following request:

{
  "docvalue_fields": [
    "delivery_location.location"
  ],
  "size": 10000,
  "_source": false,
  "stored_fields": [
    "delivery_location.location"
  ],
  "script_fields": {},
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "match_all": {}
        },
        {
          "range": {
            "delivery_date": {
              "gte": "2017-01-01T00:00:00.000Z",
              "lte": "2019-12-31T23:30:00.000Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  }
}

And this response:

{
  "took": 38,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 772,
    "max_score": 0,
    "hits": [
      {
        "_index": "cm_delivery_locations",
        "_type": "_doc",
        "_id": "WeDtuXABDb_U0HpUp96o",
        "_score": 0
      },
      {
        "_index": "cm_delivery_locations",
        "_type": "_doc",
        "_id": "SODtuXABDb_U0HpUp96o",
        "_score": 0
      },
      {

... cut data ...

      }
    ]
  }
}

There are no dots shown on the map. Does this mean that the geo_point coordinates are not being returned correctly? Should I expect to see lat/long data in the hits array?

Thanks,

Looks like your documents are not populating delivery_location.location. Can you run the following command in Kibana Dev Tools? What do your documents look like?

GET cm_delivery_locations/_search
{
  "size": 10
}

I get the following:

{
  "took" : 0,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 1056,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "cm_delivery_locations",
        "_type" : "_doc",
        "_id" : "WODtuXABDb_U0HpUSNqT",
        "_score" : 1.0,
        "_source" : {
          "delivery_postcode" : "xxx xxx",
          "delivery_date" : "2017-03-29T23:00:00.000Z",
          "delivery_location" : {
            "longitude" : -1.xxxxxxxxxxxxxx,
            "latitude" : 52.xxxxxxxxxxxxxx
          }
        }
      },
      {
        "_index" : "cm_delivery_locations",
        "_type" : "_doc",
        "_id" : "WeDtuXABDb_U0HpUSNqT",
        "_score" : 1.0,
        "_source" : {
          "delivery_postcode" : "xxx xx",
          "delivery_date" : "2016-03-29T23:00:00.000Z",
          "delivery_location" : {
            "longitude" : -1.xxxxxxxxxxxxxx,
            "latitude" : 51.xxxxxxxxxxxxxx
          }
        }
      },

... cut other 8 results ...

      }
    ]
  }
}

I see the problem in your data.

delivery_location.location is not populated so there is nothing to map. You need to update your ingest process to ensure delivery_location.location is set.
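A quick way to confirm this is an exists query in Dev Tools; if the count comes back as 0, the field was never indexed (sketch, using your index name from above):

```
GET cm_delivery_locations/_count
{
  "query": {
    "exists": { "field": "delivery_location.location" }
  }
}
```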

How do I do that? Is it to do with the configuration in Logstash? Are there some docs you can point me at? Thanks

Where is the data coming from? Have you tried the Logstash stdout plugin to view the documents before they get inserted into Elasticsearch? This is a great way to see what the documents look like and debug why they are different than expected.

That's what I'm using and it shows this as its output:

{
    "delivery_date" => 2018-09-16T23:00:00.000Z,
    "delivery_postcode" => "XXX XXX",
    "delivery_location" => {
        "lat" => 51.XXXXXXXXXXX,
        "lon" => -1.XXXXXXXXXXX
    }
}

I guess I had assumed from this that the location geo_point would be auto-populated. However, that doesn't seem to be the case, and I'm not sure what I'm missing.

You need to update your filter expression to set location. Try something like the below:

filter {
    mutate {
      add_field => { "[delivery_location][latitude]" => "%{[latlong][0]}" }
      add_field => { "[delivery_location][longitude]" => "%{[latlong][1]}" }
    }

    mutate {
      convert => { "[delivery_location][latitude]" => "float" }
      convert => { "[delivery_location][longitude]" => "float" }
    }

    mutate {
      add_field => { "[delivery_location][location][lat]" => "%{[delivery_location][latitude]}" }
      add_field => { "[delivery_location][location][lon]" => "%{[delivery_location][longitude]}" }
    }
}
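Alternatively, since geo_point also accepts a "lat,lon" string, the last mutate could be collapsed into a single add_field (untested sketch):

```
mutate {
  add_field => { "[delivery_location][location]" => "%{[delivery_location][latitude]},%{[delivery_location][longitude]}" }
}
```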

Thank you Nathan, that was it!!

If I am interpreting this correctly, lat/lon have to be nested fields within 'location', which is a specific ES construct?

There are several ways to define lat and lon for geo_points. One way is an object with lat and lon properties. Others are listed at https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html
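For example, these are all equivalent ways to index the same point per those docs (index name and coordinates are illustrative; note the array form is [lon, lat]):

```
PUT my_index/_doc/1
{ "location": { "lat": 41.12, "lon": -71.34 } }

PUT my_index/_doc/2
{ "location": "41.12,-71.34" }

PUT my_index/_doc/3
{ "location": [ -71.34, 41.12 ] }
```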

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.