Grouping GPS points into custom GeoJSON sections

Hello all!

Wondering if someone can shed some light on whether I'm trying to do the impossible in Kibana :slight_smile:

I'll use example images for the purposes of this post.

The following image shows a number of geographical locations across London. These are contained within an index (we'll call this the 'locations' index) as a number of documents, each with their own geo_point field containing the location.

Similar to what has been demonstrated here (https://blog.mimacom.com/custom-region-map/), I want to portray these locations as a custom region map in Kibana.

As per the blog post, I've created my own geoJSON file using http://geojson.io/

The geoJSON file contains 4 regions within London, and once uploaded to Kibana (within a new map visualisation), it looks like this -

Uploading the geoJSON file automatically creates an index containing the geo-points that make up the 4 regions. We'll call this index 'regions'.

My question is: is it possible to join the 'locations' index with the 'regions' index somehow, and change the colour of each region based on the number of locations contained within it? For example, in my image it's clear the top-right region contains the most locations, so this would be a darker red. The lower-left region, containing the least, would be yellow (the colours don't matter; it's the scaling I'm interested in).

My worry is that I can only join the two indexes based on terms, but I don't have any terms defined in the geoJSON file, and therefore there are no terms defined in the regions index.

Any thoughts/ideas are very welcome!

Thanks

Hi Ben.

You can add terms from the region to your GPS points with an enrich processor.

Using this method, you would first upload your regions GeoJSON into an Elasticsearch index. Then set up the geo_match enrich policy to use the regions index with enrich fields (such as region id). You then ingest your GPS points with a pipeline that uses the geo_match enrich policy you set up. There's a full example here.
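Roughly speaking, the setup looks like the following. The index, field, and policy names here are just placeholders, so adjust them to match your own mapping -

```
PUT /_enrich/policy/region_policy
{
  "geo_match": {
    "indices": "regions",
    "match_field": "geometry",
    "enrich_fields": [ "region_id" ]
  }
}

POST /_enrich/policy/region_policy/_execute

PUT /_ingest/pipeline/region_lookup
{
  "processors": [
    {
      "enrich": {
        "policy_name": "region_policy",
        "field": "location",
        "target_field": "region",
        "shape_relation": "INTERSECTS"
      }
    }
  ]
}
```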

Now that your GPS points have a field matching the region (for example, region_id), you can use a Choropleth layer in Elastic Maps to aggregate the GPS points by region_id, join the results of that aggregation to your regions, and style them appropriately.
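The term join in Maps is effectively a terms aggregation on that field, so you can also check the counts directly. For example, assuming the enriched value ends up indexed as a keyword field called region.region_id on your 'locations' index -

```
GET /locations/_search
{
  "size": 0,
  "aggs": {
    "points_per_region": {
      "terms": { "field": "region.region_id" }
    }
  }
}
```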

Let me know if anything is unclear.


Hi Nick,

Thanks for coming back to me; it's fantastic that there's a solution! Everything seems clear; however, in trying to implement the solution, it appears I must be missing something. I'm following the example, making the necessary changes to reflect my index names etc., but for some reason the end result does not contain the enriched data.

Firstly, I actually added a custom "ID" field when creating the geoJSON file, so my regions are labelled as follows -

8B2895D0-88B1-4274-ACF4-C561551467AD

I called this index "london-map"; here's an image of the ingested geoJSON file as an index (I've highlighted the custom ID field) -

I then worked through the example, starting with creating an enrichment policy (as I already had a source index with documents) -

```
PUT /_enrich/policy/regionid_policy
{
  "geo_match": {
    "indices": "london-map",
    "match_field": "coordinates",
    "enrich_fields": [ "ID" ]
  }
}
```

I then executed the enrichment policy -

```
POST /_enrich/policy/regionid_policy/_execute
```

Then I created the ingest pipeline -

```
PUT /_ingest/pipeline/regionid_lookup
{
  "description": "Enrich Region IDs",
  "processors": [
    {
      "enrich": {
        "policy_name": "regionid_policy",
        "field": "home_gps",
        "target_field": "region_id",
        "shape_relation": "INTERSECTS"
      }
    }
  ]
}
```
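(Side note: as I understand it, a pipeline like this can also be tested on its own with the simulate endpoint before indexing anything; something like the following, if I've read the docs correctly -)

```
POST /_ingest/pipeline/regionid_lookup/_simulate
{
  "docs": [
    {
      "_source": {
        "first_name": "Test",
        "last_name": "document",
        "home_gps": "POINT (51.62897 -0.038394)"
      }
    }
  ]
}
```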

I initially tried the INTERSECTS shape relation, but I also tried the WITHIN and CONTAINS options. They didn't work either; in fact, the CONTAINS option gave me an error when indexing the test document.

Finally I indexed my test document to make sure it works -

```
PUT /users/_doc/0?pipeline=regionid_lookup
{
  "first_name": "Test",
  "last_name": "document",
  "home_gps": "POINT (51.62897 -0.038394)"
}
```

I ensured the lat/long format matched that in the example, which meant removing the comma after the lat value.

I then ran "GET /users/_doc/0" to verify the enrichment had happened, and got back the following -

```
{
  "_index" : "users",
  "_type" : "_doc",
  "_id" : "0",
  "_version" : 3,
  "_seq_no" : 2,
  "_primary_term" : 1,
  "found" : true,
  "_source" : {
    "home_gps" : "POINT (51.62897 -0.038394)",
    "last_name" : "document",
    "first_name" : "Test"
  }
}
```

As you can see, the enrichment doesn't appear to have worked, as the ID field has not been added to the document! Have I missed something?

Any help is as always greatly appreciated. Many thanks!

Figured it out :slight_smile:

I ran through the example word for word, and it all worked fine, so it had to have been something about my setup.

I replaced my generated geoJSON file with a manual setup of the index, using the PUT commands in the example. I also chose to go with the 'envelope' type geo_shape option, which seems to be a simpler shape than a polygon. After some faffing around trying to work out the correct order of lat-lon values to use, I finally managed to upload it OK using the following command -

```
PUT /region_ids/_doc/1?refresh=wait_for
{
  "location": {
    "type": "envelope",
    "coordinates": [ [ -0.46554565429687494, 51.64614620689681 ], [ -0.11260986328124999, 51.46427482966439 ] ]
  },
  "region_id": "01"
}
```
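In case it helps anyone else with the ordering: the envelope takes its two corners as upper-left then lower-right, with longitude first in each pair, i.e. [ [ minLon, maxLat ], [ maxLon, minLat ] ]. A geo_shape query against the regions index with a known London point (again longitude first) should also confirm the envelope covers the right area - this assumes the shape lives in the location field, as above -

```
GET /region_ids/_search
{
  "query": {
    "geo_shape": {
      "location": {
        "shape": {
          "type": "point",
          "coordinates": [ -0.31036376953125, 51.5463350479341 ]
        },
        "relation": "intersects"
      }
    }
  }
}
```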

Once I ran through the rest of the exercise and enriched my document, I got the desired outcome -

```
{
  "_index" : "users",
  "_type" : "_doc",
  "_id" : "0",
  "_version" : 5,
  "_seq_no" : 4,
  "_primary_term" : 1,
  "found" : true,
  "_source" : {
    "home_gps" : "POINT (-0.31036376953125 51.5463350479341)",
    "last_name" : "Brown",
    "geo_data" : {
      "region_id" : "01",
      "location" : {
        "coordinates" : [
          [
            -0.46554565429687494,
            51.64614620689681
          ],
          [
            -0.11260986328124999,
            51.46427482966439
          ]
        ],
        "type" : "envelope"
      }
    },
    "first_name" : "Mardy"
  }
}
```

Thanks again for your help!


That's fantastic @benji87.

Just a note for future discussions: it's much easier for us to read your JSON when you surround it with three backticks (```).

