Problems uploading GeoJSON data in Kibana Maps (Elastic Cloud)

Hi!
First of all, thanks to anyone who can help me with this.
I had some problems trying to add GeoJSON data in Kibana Maps.
The problem is that when I try to upload a file it keeps loading and loading, but nothing happens; even after an hour it stays in this state without any error, result or alert of any kind (illustration 1).
It is important to mention that I am working with the Kibana deployment (v7.6.2) of Elastic Cloud; however, when I try to upload the SAME FILE in my local Kibana (localhost:5601) it works perfectly (illustration 2).
Can someone help me with this? Does anyone know how to solve it, please?

Additional info:

  • File size: 7,037 KB
  • Type: Point
  • Number of features: 3455 points


Illustration 1: Upload error

Illustration 2: File working in localhost:5601

Hi @Pato_Loza,

Welcome to our community! Is there any browser console error when you see this issue on cloud?

Thanks,
Liza

Thanks for answering me Liza!
No, no errors are displayed in the browser console or in Kibana.

Let me tell you that I tried with another, lighter GeoJSON file and it worked, but with this specific one it didn't.
I am wondering if it is because I am using a free trial of Elastic Cloud. Are there restrictions like this in the free trial or not?
Please, let me know if you think there could be another reason for this error!

Thanks @Pato_Loza,

Let me see if one of our Kibana Maps experts @thomasneirynck can advise what to check next before we route this to Cloud. There should not be a restriction on this in the Cloud trial, but I could be wrong. It is interesting that you tried the same file on-prem and it worked; are both your cloud and on-prem deployments using the same version, 7.6.2?

Regards,
Liza

On my localhost:5601 I am using Kibana v7.6.1, and in the cloud v7.6.2. Additionally, I tried to solve the problem by changing the deployment version in Elastic Cloud from v7.6.2 to v7.5.2, but it didn't work either.

Thanks Liza!

Hi @Pato_Loza,

Would it be possible to share the GeoJSON file for us to take a look? Without errors in the browser console it is hard to guess what may be happening, and it will definitely be faster than having to search through the cloud logs.

Have you tried it on your localhost with 7.6.2?

2 Likes

Thanks for your help @jsanz! I just sent you the aforementioned GeoJSON through a Drive link in a message.

No, I haven't, but let me try.

Thanks,

@jsanz I tried on localhost with Kibana v7.6.2 and it worked!

I found https://www.elastic.co/blog/how-to-ingest-geospatial-data-into-elasticsearch-with-gdal to be a more reliable way to load spatial data into Elasticsearch. The Kibana UI limits the size of GeoJSON files you can upload (and requires a pre-processing step if your starting point is a shapefile).
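If your data does start as a shapefile, that pre-processing step can be a plain ogr2ogr conversion. A minimal sketch, assuming the osgeo/gdal image mentioned below and placeholder file names (my_data.shp / my_data.geojson):

$ docker run --rm -v $(pwd):/data osgeo/gdal:alpine-normal-latest \
    ogr2ogr -f GeoJSON /data/my_data.geojson /data/my_data.shp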

Follow the instructions at https://github.com/OSGeo/gdal/tree/master/gdal/docker to get a version of GDAL that works with Elasticsearch (I'm using osgeo/gdal:alpine-normal-latest).
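To confirm the image you pulled actually ships the Elasticsearch driver, listing the supported formats should show an Elasticsearch entry; roughly:

$ docker pull osgeo/gdal:alpine-normal-latest
$ docker run --rm osgeo/gdal:alpine-normal-latest ogr2ogr --formats | grep -i elastic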

Incidentally, I've found Kibana Maps on Elastic Cloud to be flaky. I've seen errors similar to the one in your screenshot when the Elastic Cloud Kibana instance crashes.

Hi @Pato_Loza,

Can you also send me the file as you did for @jsanz? I can try it as well to see if I can reproduce the issue. Unless the file contains sensitive information, in which case I understand.

Thanks,
Liza

@Pato_Loza I could load your datasets into our cloud (7.6.2) without any issue:

I can only assume computing power is the difference between environments. What are your cloud deployment specs? The second dataset has rather large polygons; you may want to simplify them, remove unused fields, etc. In our experience, mapshaper is a nice tool (with a CLI version) for this kind of boundary layer, since it retains the topology of the polygons.
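For instance, dropping unused attributes with the mapshaper CLI could look roughly like this (the input file and the NAME and CODE fields are just placeholders; -filter-fields keeps only the listed fields):

$ mapshaper -i boundaries.geojson \
    -filter-fields NAME,CODE \
    -o boundaries.slim.geojson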

Also, as @pgoldtho mentioned, you may prefer to use ogr2ogr for big datasets, since you have much better control of your mapping, dozens of input formats, etc.
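As a sketch of that extra control over the mapping, the GDAL Elasticsearch driver accepts layer creation options; the example below passes a custom mapping file with -lco MAPPING (mapping.json, my_index, ES_URL, USER and PASSWORD are placeholders, and this assumes the MAPPING option as described in the GDAL Elasticsearch driver docs):

$ docker run --rm -v $(pwd):/data osgeo/gdal:alpine-small-latest \
    ogr2ogr -f Elasticsearch -nln my_index \
    -lco MAPPING=/data/mapping.json \
    "https://USER:PASSWORD@ES_URL" \
    /data/my_data.geojson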

1 Like

@pgoldtho thanks for answering; it is good to know that I am not the only one who has this error. Do you know how I can use GDAL with Elastic Cloud? I was checking out the links that you posted, but I don't really know how to use them in order to upload data into Elastic Cloud. Or maybe @jsanz could help me with this?
Finally, answering your question, I am using the recommended specs of an "I/O Optimized" deployment.
[Screenshot: deployment specs]
Thanks!

That deployment is perfectly capable of managing this and much bigger datasets. Since this seems to be an issue with our cloud service, I suggest you open a support ticket (details here). Please include this post for reference.


Regarding ogr2ogr, the blog post explains how to use the tool. You can find the Elasticsearch endpoint in the deployment interface.

The thing is that the current stable release of the tool is not yet compatible with Elasticsearch 7. The easiest way to execute it at the moment is by using a Docker image, but you need some "fluency" with both tools to debug any issues you may find.

The workflow I would use is:

  1. Clean and optionally simplify the file:
$ mapshaper -i DPA_CANTONAL_S.geojson \
 -clean -verbose -simplify "80%" \
 -o DPA_CANTONAL_S.clean.geojson
  2. Upload it to your cluster, given an ES_URL, USER, and PASSWORD, assuming the file is in the working directory:
$ docker run --rm -u $(id -u ${USER}):$(id -g ${USER}) \
-v $(pwd):/data \
osgeo/gdal:alpine-small-latest \
ogr2ogr -nln cantonal_test -f Elasticsearch \
    "https://USER:PASSWORD@ES_URL" \
    /data/DPA_CANTONAL_S.clean.geojson
  3. Check that the index cantonal_test has been generated and populated:
$ curl "https://USER:PASSWORD@ES_URL/_cat/indices/cantonal*?v"
health status index         uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   cantonal_test bNy5iTM6SHujMAm2h9t2zA   1   1        221            0       10mb           10mb
  4. Load it in Maps
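Optionally, before adding the layer in Maps, you can double-check that the geometry field got a geo_point or geo_shape mapping (same placeholders as above):

$ curl "https://USER:PASSWORD@ES_URL/cantonal_test/_mapping?pretty"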

Hope it helps

You'll need to install Docker (https://docs.docker.com/get-docker/) if you don't already have it. Then run docker pull osgeo/gdal:alpine-normal-latest to download a GDAL image. You only need to run the docker pull command once.

After that you can run something like this to upload your data:

docker run --rm -v /Users:/Users osgeo/gdal:alpine-normal-latest ogr2ogr -skipfailures \
ES:https://elastic:es-password@cloudinstance.us-central1.gcp.cloud.es.io:9243 \
$PWD/FL_Wetlands.shp

This assumes you are using a Mac. Change -v /Users:/Users to -v /home:/home on Linux. I haven't tried this on Windows.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.