We are currently working on ingesting multiple polygons from our MySQL database into Elasticsearch so they can be visualized in Kibana Maps. We tried generating a GeoJSON file from the polygons we need, but we encountered errors when trying to visualize it. Is there a way to format or filter the data before ingestion in Logstash? Or do we need to look for another route to ingest our polygons? Thanks.
Hi @edbautista. Can you share which errors you received when trying to visualize?
If the existing Logstash filters are insufficient for this, you can write a custom filter using the ruby filter plugin.
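For example, here is a minimal sketch of a filter block, assuming the MySQL query returns each geometry as a GeoJSON string in a column called `geometry` (the column name is hypothetical). In many cases the built-in `json` filter is enough to turn that string into a structured object, with `ruby` as a fallback for more involved transformations:

```
filter {
  # Option 1: parse the GeoJSON string into a structured object
  json {
    source => "geometry"
    target => "geometry"
  }

  # Option 2: the same thing via a custom ruby filter, which also
  # gives you a place to reshape or validate the geometry first
  # ruby {
  #   code => '
  #     require "json"
  #     raw = event.get("geometry")
  #     event.set("geometry", JSON.parse(raw)) if raw.is_a?(String)
  #   '
  # }
}
```

Either way, the target index needs an explicit `geo_shape` mapping for that field, since dynamic mapping will not infer one.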
{
  "success": true,
  "failures": [
    {
      "item": 3,
      "reason": "failed to parse field [coordinates] of type [geo_shape]",
      "doc": {
        "coordinates": {
          "type": "polygon",
          "coordinates": [
            [
              [
                125.13984323575562,
                6.133886657363985
              ],
              [
                125.13984323575562,
                6.133886657363985
              ], ....
Because we don't know how to ingest geo_shape using Logstash from MySQL, we opted to generate a GeoJSON file. When we ingested the GeoJSON file via Kibana Maps, it gave us that failure message. We made a workaround using our own upload tool and got it working. However, we would still like to know how to ingest geo_shape using Logstash. Thanks @nickpeihl.
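One common cause of that `failed to parse field ... of type [geo_shape]` error is invalid polygon geometry: per the GeoJSON spec, each linear ring must contain at least four positions and must be closed (first point equal to the last). The snippet in the failure above also shows two identical consecutive points, which suggests the ring may be degenerate. A quick sanity check along these lines (a sketch, not tied to your actual data) can help narrow it down:

```python
def validate_polygon(geom):
    """Return a list of problems found in a GeoJSON Polygon geometry."""
    problems = []
    if geom.get("type", "").lower() != "polygon":
        problems.append("type is not 'polygon'")
        return problems
    for i, ring in enumerate(geom.get("coordinates", [])):
        if len(ring) < 4:
            problems.append(f"ring {i} has fewer than 4 points")
        elif ring[0] != ring[-1]:
            problems.append(f"ring {i} is not closed (first point != last point)")
    return problems

# Example: a ring with too few points is rejected
bad = {"type": "polygon",
       "coordinates": [[[125.1, 6.1], [125.2, 6.1], [125.2, 6.2]]]}
print(validate_polygon(bad))  # prints ['ring 0 has fewer than 4 points']
```

Running every feature in the GeoJSON file through a check like this before ingestion should surface the offending item numbers reported in the failure response.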
Would you be willing to share the entire geojson file with us so we can debug it in Maps?
There does not appear to be a native way for Logstash to convert a GeoJSON feature from MySQL to a geo_shape in Elasticsearch, so you'll have to write your own Logstash filter. Perhaps I can offer some ideas. Can you also share one of the documents as it was stored in Elasticsearch?
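One thing worth double-checking regardless of the filter approach: Elasticsearch never infers `geo_shape` through dynamic mapping, so the index must declare the field explicitly before any documents are indexed. Something like this (index and field names are just placeholders):

```
PUT my-polygons
{
  "mappings": {
    "properties": {
      "geometry": { "type": "geo_shape" }
    }
  }
}
```

If the documents were indexed before a mapping like this existed, the field will have been mapped as `text`/`object` and the shapes won't be queryable in Maps even if ingestion succeeds.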