Choropleth map from a spatial query?

I have 2 indices: the first is static with custom geometries (geo_shape), and the second is a time-based stream of logs with coordinates (geo_point).

I need to show the custom geometries from the 1st index in a color palette that represents how many documents from the second index fall inside each of them.

I was hoping some native functionality would be available, but I can only see the "Term Joins" dialog when adding a layer to a map. However, I cannot join by attribute. The only existing relation between the 2 indices is a spatial relation.
https://www.elastic.co/guide/en/kibana/current/maps-add-choropleth-layer.html

Is there any workaround on how to do such visualization in Kibana?

Otherwise I guess I have to implement some custom Logstash filter in order to match by an attribute, or maybe it could be achieved with a lookup filter.

The only existing relation between the 2 indices is a spatial relation

There's no UX in Kibana Maps to do this directly, but you can make this happen by using the Elasticsearch enrich processor.

Basically, what you will be doing is creating an ingest pipeline in Elasticsearch that uses a geo_match enrich policy to add a field to your documents. There is a tutorial in the docs here: Example: Enrich your data based on geolocation | Elasticsearch Guide [8.11] | Elastic

This functionality is equivalent to doing a "spatial join". It uses the "intersects" spatial relationship.
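
For reference, here is a minimal sketch of that setup. It assumes index1 keeps its geometry in a geo_shape field called geometry, index2 stores the point in a geo_point field called location, and index1 carries a region_id attribute to join on - all of these field names are placeholders, so adjust them to your mappings:

    # Create a geo_match enrich policy over the static geometries
    PUT /_enrich/policy/region_policy
    {
      "geo_match": {
        "indices": "index1",
        "match_field": "geometry",
        "enrich_fields": ["region_id"]
      }
    }

    # Build the policy's internal lookup index
    POST /_enrich/policy/region_policy/_execute

    # Pipeline that copies region_id onto every log document
    # whose location intersects one of the geometries
    PUT /_ingest/pipeline/region_lookup
    {
      "processors": [
        {
          "enrich": {
            "policy_name": "region_policy",
            "field": "location",
            "target_field": "region",
            "shape_relation": "intersects"
          }
        }
      ]
    }

Note that the enrich index is a static snapshot: if the geometries in index1 ever change, you have to run _execute again to pick up the changes.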

The advantage of this approach is that once this pipeline is set up, it will enrich your documents from index2 (which is a time-based stream of logs) as they get indexed in Elasticsearch.
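
For example, to run every new document through that pipeline without touching your indexing clients, you could set it as the index's default pipeline (pipeline name taken from the sketch above):

    PUT /index2/_settings
    {
      "index.default_pipeline": "region_lookup"
    }

Since index2 is time-based, it is probably better to put this setting in the index template instead, so every newly created index inherits it.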

You can then use Kibana Maps to create the choropleth map.

  • Add index1 with the geo_shape field as a layer to the map.
  • Do a term join on the shared attribute that you computed with the enrich policy.
  • Style by-value for fill-color on your metric.
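
A term join is essentially a terms aggregation under the hood, so you can sanity-check the per-region counts outside of Maps with a query like the following (again using the placeholder names from the sketch above, and assuming region.region_id is mapped as keyword):

    GET /index2/_search
    {
      "size": 0,
      "aggs": {
        "docs_per_region": {
          "terms": { "field": "region.region_id" }
        }
      }
    }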

Thanks Thomas,

I didn't know about _enrich.
Since we have our processing in Logstash, I wonder whether it is a good idea to use the Logstash lookup filter for this purpose - the expected volume is 20 events/sec.

I guess using an Elasticsearch pipeline is much more optimized because it has the data locally. However, it feels bad having to split up the processing across two places.

Thanks

