How can I combine lat/long fields into a geo_point in Elasticsearch?


I've ingested CSV data via pgsync from Postgres RDS into Elasticsearch. My index contains "lat" and "lng" fields:

"lat" : {
      "type" : "float"
    },
    "lng" : {
      "type" : "float"

How would I convert this into an acceptable geo_point format so that I can map it in Kibana?

I already added this mapping:

```
PUT /my-index/_mapping
{
  "properties": {
    "location": {
      "type": "geo_point"
    }
  }
}
```

but when I try to generate the new coordinate field via:

```
POST your_index/_update_by_query
{
  "script": {
    "inline": "ctx._source.location = ctx._source.lat, ctx._source.lng",
    "lang": "painless"
  },
  "query": {
    "match_all": {}
  }
}
```

I'm unable to, and I get this error:

```
{
  "error" : {
    "root_cause" : [
      {
        "type" : "script_exception",
        "reason" : "compile error",
        "script_stack" : [
          "... ocation = ctx._source.lat, ctx._source.lng",
          "                             ^---- HERE"
        ],
        "script" : "ctx._source.location = ctx._source.lat, ctx._source.lng",
        "lang" : "painless",
        "position" : {
          "offset" : 38,
          "start" : 13,
          "end" : 55
        }
      }
```

Are there any suggestions or corrections for this?

Be careful with your lat and lon / lng and make sure you get the coordinates in the correct order: when you write a geo_point as an array, it must be [lon, lat] (GeoJSON order), not [lat, lon].

```
DELETE discuss-location-index

PUT discuss-location-index
{
  "mappings": {
    "properties": {
      "lat": {
        "type": "float"
      },
      "lon": {
        "type": "float"
      },
      "location": {
        "type": "geo_point"
      }
    }
  }
}

POST discuss-location-index/_doc
{
  "lat" : 10.0,
  "lon" : 11.0
}

POST discuss-location-index/_update_by_query
{
  "script": {
    "source": "ctx._source.location = [ctx._source.lon, ctx._source.lat]",
    "lang": "painless"
  },
  "query": {
    "match_all": {}
  }
}
```
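If you'd rather not rely on the array order, Painless map literals let you build the point with explicit keys instead (an equivalent variant of the script above, same field names):

```
POST discuss-location-index/_update_by_query
{
  "script": {
    "source": "ctx._source.location = ['lat': ctx._source.lat, 'lon': ctx._source.lon]",
    "lang": "painless"
  },
  "query": {
    "match_all": {}
  }
}
```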

```
# Search
GET discuss-location-index/_search
{
  "fields": [
    "*"
  ]
}
```

```
# Result
{
  "took" : 0,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 1,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "discuss-location-index",
        "_id" : "ypmVroIBJQ-PaY-rMcqk",
        "_score" : 1.0,
        "_source" : {
          "lon" : 11.0,
          "location" : [
            11.0,
            10.0
          ],
          "lat" : 10.0
        },
        "fields" : {
          "lon" : [
            11.0
          ],
          "location" : [
            {
              "coordinates" : [
                11.0,
                10.0
              ],
              "type" : "Point"
            }
          ],
          "lat" : [
            10.0
          ]
        }
      }
    ]
  }
}
```
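As an aside, geo_point accepts more input formats than the array form; if the [lon, lat] ordering is easy to get wrong, the object and string forms make it explicit (a quick sketch against the same test index):

```
# Object form: keys are explicit
POST discuss-location-index/_doc
{
  "location": { "lat": 10.0, "lon": 11.0 }
}

# String form: note the order flips to "lat,lon"
POST discuss-location-index/_doc
{
  "location": "10.0,11.0"
}
```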

Thank you Stephan, the problem is that I want to do bulk uploading from my DB to Elasticsearch via pgsync. Do you have any suggestions for this?

Yes, I was thinking that... but you asked about a Painless script, so that is the answer you got 🙂 That Painless script will update all the existing documents, but it runs only once; it won't cover documents pgsync indexes afterwards...
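If you want new documents handled automatically as pgsync loads them, one option (not from this thread, just a sketch assuming your source fields stay named lat and lng and your index is my-index) is an ingest pipeline that builds location, attached as the index's default_pipeline:

```
# Build a geo_point object from the two source fields
PUT _ingest/pipeline/latlng-to-geopoint
{
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": "if (ctx.lat != null && ctx.lng != null) { ctx.location = ['lat': ctx.lat, 'lon': ctx.lng] }"
      }
    }
  ]
}

# Run the pipeline on every document indexed into my-index
PUT my-index/_settings
{
  "index.default_pipeline": "latlng-to-geopoint"
}
```

With that in place, the _update_by_query is only needed once, for the documents that are already indexed.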

Perhaps look at this thread, especially the last couple of posts (it is a bit long, but the OP is loading geo data from Postgres using Logstash).

I do not know pgsync... but in the end you just need to put the lat / lon in the right fields... I would think you would be able to do that; I'm just not familiar with the tool.

Looks to me like you will need to use the transform node rename logic to rename your source fields to the correct fields; a sketch follows.
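I haven't verified this against pgsync myself, but based on its transform / rename schema support, something along these lines in schema.json might do it (the database, table, and column names here are assumptions):

```
{
  "database": "mydb",
  "index": "my-index",
  "nodes": {
    "table": "locations",
    "columns": ["lat", "lng"],
    "transform": {
      "rename": {
        "lng": "lon"
      }
    }
  }
}
```

Renaming alone won't produce a geo_point, though, so you would still pair this with the geo_point mapping and something like the ingest pipeline above.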

Got it. Thanks a lot!