Correct method for geospatial mapping from Postgres -> Logstash -> Elasticsearch


(Freddie) #1

I'm getting the following error with this mapping template:

            "boundingArea": {
                "type": "geo_shape",
                "tree": "quadtree",
                "precision": "1m"
            },

Does anyone know what I'm doing wrong?

Thanks

[2015-08-06 17:12:14,839][DEBUG][action.bulk ] [gmti query] [gmti_y2008-12_v1][0] failed to execute bulk item (index) index {[gmti_y2008-12_v1][gmti_mission][AU8E23Lps3sYXUFDKKMx], source[{"missionUid":1,"packet_header_number":1,"packet_segment_number":1,"missionId":"2008DEC31USGHAWK1_000095","stanagMissionId":95,"nationality":"US","platformId":"HAWK1","freeText":null,"flightPlan":"east3.nav","missionPlan":"","platformType":{"type":"json","value":"{"name":"GLOBAL","enum":20}"},"platformConfig":"GH-SIM","imageId":null,"sic":null,"countryCode":null,"targetBE":null,"targetId":null,"classification":{"type":"json","value":"{"name":"UNCLASSIFIED","enum":5}"},"codeword":0,"filename":"200812/HAWK1/kirking_20081231.4607","startTime":"2008-12-31T10:00:22.809","stopTime":"2008-12-31T17:59:57.072","created":"2015-05-26T17:07:01.454","createdBy":"Stanag4670 V1","exerciseIndicator":null,"missionState":{"type":"json","value":"{"name":"ENABLED","enum":0}"},"fileSize":8784670,"numberOfDots":44869,"numberOfClusters":null,"boundingArea":"{"type":"Polygon","coordinates":[[[0.609260818798618,0.766326292040311],[0.609260818798618,0.784287693004756],[0.62406440137011,0.784287693004756],[0.62406440137011,0.766326292040311],[0.609260818798618,0.766326292040311]]]}","classificationSystem":"US","startTimeOrig":"2008-12-31T10:00:22.809","stopTimeOrig":"2008-12-31T17:59:57.072","dotMask":0,"coincidentMask":0,"sanitized_file_name":null,"numberOfRestrictions":null}]}
org.elasticsearch.action.WriteFailureException
at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:470)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:418)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:148)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase.performOnPrimary(TransportShardReplicationOperationAction.java:574)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase$1.doRun(TransportShardReplicationOperationAction.java:440)
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to parse [boundingArea]
at org.elasticsearch.index.mapper.geo.GeoShapeFieldMapper.parse(GeoShapeFieldMapper.java:263)
at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:706)
at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:497)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:544)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:493)
at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:466)
... 8 more
Caused by: org.elasticsearch.ElasticsearchParseException: Shape must be an object consisting of type and coordinates
at org.elasticsearch.common.geo.builders.ShapeBuilder$GeoShapeType.parse(ShapeBuilder.java:725)
at org.elasticsearch.common.geo.builders.ShapeBuilder.parse(ShapeBuilder.java:293)
at org.elasticsearch.index.mapper.geo.GeoShapeFieldMapper.parse(GeoShapeFieldMapper.java:244)
... 13 more
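
For reference, the `Shape must be an object consisting of type and coordinates` exception means the `geo_shape` mapper expects the field value to be a JSON *object* — in the failed document above, `boundingArea` is a quoted string containing GeoJSON. Using the coordinates from the log, the parser wants the value shaped like this (a sketch of the same polygon, not taken verbatim from the thread):

```json
"boundingArea": {
    "type": "Polygon",
    "coordinates": [[
        [0.609260818798618, 0.766326292040311],
        [0.609260818798618, 0.784287693004756],
        [0.62406440137011,  0.784287693004756],
        [0.62406440137011,  0.766326292040311],
        [0.609260818798618, 0.766326292040311]
    ]]
}
```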


(Mark Walkom) #2

Bounding shapes need to be closed — the first and last coordinate pairs must be the same. See the geo_shape docs for details :slight_smile:
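
In case it helps anyone hitting the same trace: since `boundingArea` appears to leave Postgres as a text column, one option is to parse the string into a real object in the Logstash filter stage with the `json` filter. A minimal sketch, assuming the field is named `boundingArea` as in the log above:

```
filter {
  # boundingArea arrives from the input as a string of GeoJSON;
  # parse it so Elasticsearch receives an object with "type" and
  # "coordinates" keys instead of a quoted string.
  json {
    source => "boundingArea"
    target => "boundingArea"
  }
}
```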
