Hi!
First of all, thanks to anyone who can help me with this.
I've had some problems trying to add GeoJSON data in Kibana Maps.
The problem is that when I try to upload a file it keeps loading and loading, but nothing happens; even after an hour it is still stuck in this process without any error, result or alert of any kind (illustration 1).
It is important to say that I am working with the Kibana deployment (v7.6.2) of Elastic Cloud; nevertheless, when I try to upload the SAME FILE in my Kibana (localhost:5601) it works perfectly (illustration 2).
Can someone help me with this? Does anyone know how to solve it, please?
Thanks for answering me, Liza!
No, no errors are displayed in the browser console or in Kibana.
I tried with another, lighter GeoJSON file and it worked, but with this specific one it didn't.
I am wondering if it is because I am using a free trial of Elastic Cloud. Are there restrictions like this in the free trial or not?
Please let me know if you think there could be another reason for this error!
Let me see if one of our Kibana Maps experts, @thomasneirynck, can advise what to check next before we route this to Cloud. There should not be a restriction on this in the Cloud trial, but I could be wrong. It is interesting that you tried the same file on-prem and it worked; are both your cloud and on-prem instances on the same version, 7.6.2?
On my localhost:5601 I am using Kibana v7.6.1, and in the cloud, v7.6.2. Additionally, I tried to solve the problem by changing the deployment version in Elastic Cloud from v7.6.2 to v7.5.2, but it didn't work either.
Would it be possible to share the GeoJSON file for us to take a look? Without errors in the browser console it is hard to guess what may be happening, and it will definitely be faster than having to search through the cloud logs.
Incidentally, I've found Kibana Maps on Elastic Cloud to be flaky. I've seen errors similar to the one in your screenshot when the Elastic Cloud Kibana instance crashes.
Can you also send me the file, as you did for @jsanz? I can try it as well to see if I can reproduce the issue. Unless the file contains sensitive information, in which case I understand.
I can only assume computing power is the difference between environments. What are your cloud deployment specs? The second dataset has rather large polygons; you may want to simplify them, remove unused fields, etc. In our experience, mapshaper is a nice tool (with a CLI version) for this kind of boundary layer, since it retains the topology of the polygons.
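For example, a minimal sketch with the mapshaper CLI (the file and field names here are hypothetical; tune the simplification percentage to your data):

```
# Keep only the attribute fields you need and simplify the polygon
# boundaries to ~10% of their original vertices; keep-shapes prevents
# small polygons from disappearing, and shared borders stay aligned
# because mapshaper simplifies the shared topology.
mapshaper cantons.geojson \
  -filter-fields NAME,CODE \
  -simplify 10% keep-shapes \
  -o cantons_simplified.geojson
```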
Also, as @pgoldtho mentioned, you may prefer to use ogr2ogr for big datasets, since you get much better control over your mapping, dozens of input formats, etc.
@pgoldtho thanks for answering; it is good to know that I am not the only one who has this error. Do you know how I can use GDAL with Elastic Cloud? I was checking out the links that you posted below, but I don't really know how to use it to upload data into Elastic Cloud. Or maybe @jsanz could help me with this?
Finally, answering your question: I am using the recommended specs of an "I/O Optimized" deployment.
That deployment is perfectly capable of managing this and much bigger datasets. Since this seems to be an issue with our cloud service, I suggest you open a support ticket (details here). Please include this post for reference.
Regarding ogr2ogr, the blog post explains how to use the tool. You can find the Elasticsearch endpoint in the deployment interface.
The thing is that the current stable release of the tool is not yet compatible with Elasticsearch 7. The easiest way to execute it at the moment is via a Docker image, but you need some "fluency" with both tools to debug any issues you may find.
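As a sketch, the upload step could look something like this (USER, PASSWORD and ES_URL are placeholders for your deployment credentials and Elasticsearch endpoint, as in the check below, and the file name is illustrative):

```
# Run ogr2ogr from the GDAL Docker image, mounting the current
# directory so the container can read the GeoJSON, and write it
# to an Elasticsearch index named cantonal_test.
docker run --rm -v "$(pwd)":/data osgeo/gdal:alpine-normal-latest \
  ogr2ogr -f Elasticsearch "https://USER:PASSWORD@ES_URL" \
  /data/cantonal.geojson -nln cantonal_test
```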
Then check that the index cantonal_test has been generated and populated:
```
$ curl "https://USER:PASSWORD@ES_URL/_cat/indices/cantonal*?v"
health status index         uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   cantonal_test bNy5iTM6SHujMAm2h9t2zA   1   1        221            0       10mb           10mb
```
You'll need to install Docker (https://docs.docker.com/get-docker/) if you don't already have it. Then run `docker pull osgeo/gdal:alpine-normal-latest` to download a GDAL image. You only need to run the docker pull command once.
After that you can run something like this to upload your data:
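For instance (a sketch only; substitute your own file and index names, and replace USER, PASSWORD and ES_URL with your deployment's credentials and Elasticsearch endpoint):

```
# Mount the current directory into the container and use GDAL's
# Elasticsearch driver to load the GeoJSON into an index.
docker run --rm -v "$(pwd)":/data osgeo/gdal:alpine-normal-latest \
  ogr2ogr -progress -f Elasticsearch \
  "https://USER:PASSWORD@ES_URL" \
  /data/my_data.geojson -nln my_index
```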