Reindex to add geo_point mapping

I have a CSV file from a standalone application with the following header:

fid, owner, location.lat, location.lon

I use the CSV import functionality in Kibana to upload and index the data. Using the Kibana Dev Tools console, I examine the dynamic mapping that was produced:

GET /test_and_delete/_mapping

which returns:

{
  "test_and_delete" : {
    "mappings" : {
      "_meta" : {
        "created_by" : "ml-file-data-visualizer"
      },
      "properties" : {
        "fid" : {
          "type" : "long"
        },
        "owner" : {
          "type" : "text"
        },
        "location" : {
          "properties" : {
            "lat" : {
              "type" : "double"
            },
            "lon" : {
              "type" : "double"
            }
          }
        }
      }
    }
  }
}

I want the location to be mapped to a geo_point instead of two doubles. I understand that this requires reindexing to a new index with an explicit mapping. So I create the target index and add a mapping:

PUT /test_and_delete_mapped
{
  "mappings": {
    "properties": {
      "location": {
        "type": "geo_point"
      }
    }
  }
}

Then I reindex using:

POST _reindex
{
  "source": {
    "index": "test_and_delete"
  },
  "dest": {
    "index": "test_and_delete_mapped"
  }
}

This fails with the error:

"cause": {
  "type": "mapper_parsing_exception",
  "reason": "Could not dynamically add mapping for field [location.lon]. Existing mapping for [location] must be of type object but found [geo_point]."
}

Any clues as to what I am missing here? Why can the existing mapping not be a geo_point?

Hi @jbrowe
You are close, but as the error says, a geo_point field cannot be created by dynamic mapping from the existing lat/lon object, so you are going to need to do something like this: create a little ingest pipeline to help with the transformation.

See if you can follow along; the principle is the same, you just need to change it to fit your data.

DELETE /city

# Index a document with a non-geo_point location
POST /city/_doc
{
  "name": "north stamford",
  "location": {
    "lon": -73.572866,
    "lat": 41.142307
  }
}

GET /city/_search

GET /city/_mapping

DELETE /citynew

# Put the new mapping 
PUT /citynew
{
  "mappings": {
    "properties": {
      "name" : {"type" : "text"},
      "location_fixed": {
        "type": "geo_point"
      }
    }
  }
}

# check the mapping 
GET /citynew/_mapping

DELETE /_ingest/pipeline/convert_geo

# Create an ingest pipeline
# Copies the old location values into the new geo_point field and drops the old location field.
PUT /_ingest/pipeline/convert_geo
{
  "processors": [
    {
      "set": {
        "field": "location_fixed.lat",
        "value": "{{location.lat}}"
      }
    },
    {
      "set": {
        "field": "location_fixed.lon",
        "value": "{{location.lon}}"
      }
    },
    {
      "remove": {
        "field": "location"
      }
    }
  ]
}
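
Before reindexing, you can sanity-check the pipeline with the simulate API (the sample document below is just the one indexed above):

# Optional: dry-run the pipeline against a sample document
POST /_ingest/pipeline/convert_geo/_simulate
{
  "docs": [
    {
      "_source": {
        "name": "north stamford",
        "location": {
          "lon": -73.572866,
          "lat": 41.142307
        }
      }
    }
  ]
}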

# Now reindex using the ingest pipeline
POST _reindex
{
  "source": {
    "index": "city"
  },
  "dest": {
    "pipeline": "convert_geo", 
    "index": "citynew"
  }
}

# Check out your new index 
GET /citynew/_search

Hope this helps!

In the future you may want to use Logstash to do the ingest and perform the conversion before the data ever gets to Elasticsearch ... but that is another technique for another day...


That did help, although on the first pass I received an error when trying to remove the "location" field, so I deleted this processor:

{
  "remove": {
    "field": "location"
  }
}

It worked fine after that, although I have a remnant "location" field in the index.

Thank you

Glad I could help. The remove should work; you might need to remove location.lat and location.lon instead of the parent location field. Keep at it!
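
For example, something like this in the pipeline (untested against your data; ignore_missing makes the processor skip a field that is not present):

{
  "remove": {
    "field": ["location.lat", "location.lon"],
    "ignore_missing": true
  }
}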
