I've tried several times to load a GeoJSON file into an index through Kibana's import system. It shows me a preview of the polygons and creates an index, but then it returns an error:
"Error performing fetch: Request Entity Too Large"
with a doc_count of 0.
I also tried with a single polygon and the problem was the same, so I guess it's a GeoJSON formatting issue. Could you help me with some kind of valid template that I can use?
Hello @Hugo_Pires. I'd also be interested in what file you're using. My offhand guess would be that the file contains some very large features, maybe larger than 30 MB or so. We don't see that too often, but it does happen. If that's the case, you could try simplifying the file using a utility like https://mapshaper.org/.
I also didn't have any issues loading the file. It looks like the error you're getting is an Elasticsearch-specific error. A few thoughts:
You could check in the Elasticsearch forum since it is an Elasticsearch error. I suspect the first general advice they'll give for this error is to set an increased value for http.max_content_length. The default is 100MB, so that doesn't seem like it would be the issue; with that said, you should certainly give it a shot first so you can say you've tried it. You can find more info here
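For reference, http.max_content_length is a static setting, so it goes in elasticsearch.yml and requires a node restart; the 200mb value below is just an illustrative example, not a recommendation:

```yaml
# elasticsearch.yml — sketch only; pick a limit that fits your largest request
http.max_content_length: 200mb
```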
If it isn't too much trouble, you might try upgrading your Elasticsearch instance to 7.7.1 on the off chance it's a bug that was fixed.
If neither of the first two options work out, you should consider submitting a bug. There's always the possibility someone else is seeing the same thing whether we can reproduce it or not.
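On the template question: a minimal valid GeoJSON file for the Kibana importer is a FeatureCollection with one Feature per document. Something like the sketch below (the coordinates are just an arbitrary square; note GeoJSON uses [longitude, latitude] order, and the first and last positions of a polygon ring must match):

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": { "name": "example" },
      "geometry": {
        "type": "Polygon",
        "coordinates": [
          [
            [-9.2, 38.7],
            [-9.1, 38.7],
            [-9.1, 38.8],
            [-9.2, 38.8],
            [-9.2, 38.7]
          ]
        ]
      }
    }
  ]
}
```

If your file already matches this shape and previews correctly, the formatting is probably fine and the problem is more likely on the request-size side.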