Cannot create new index/template with geo-point field?

Hi, I'm new to this and have been wrestling with it for the past few days. I am trying to get a Logstash CSV import into Elasticsearch working correctly, but I'm running into trouble getting my geo-point field mapped.
Specifically, the error I am getting is:

{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Rejecting mapping update to [my-index-name] as the final mapping would have more than 1 type: [_doc, node_points]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Rejecting mapping update to [my-index-name] as the final mapping would have more than 1 type: [_doc, node_points]"
  },
  "status": 400
}
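
If I'm reading the error correctly, the index already has the `_doc` type from my mapping, and Logstash is then trying to write documents under a second type, `node_points`, which 6.x no longer allows. After a failed run, the types actually present in the index can be checked with the standard mapping API (shown in the same console syntax as the PUT below):

```
GET /my-index-name/_mapping
```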

The Elasticsearch index I am creating is below. (Note: I have also tried accomplishing this with an index template, with the pattern matching my index name exactly, and I still get the same error.)

PUT /my-index-name
{
  "settings": {
    "index.number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
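
For completeness, the template variant I tried (the mytemplate.json referenced in the Logstash output below) looks essentially like this; the mapping mirrors the PUT above, and index_patterns matches my index name exactly:

```json
{
  "index_patterns": ["my-index-name"],
  "settings": {
    "index.number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```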

Logstash config:

input {
  file {
    mode => "read"
    path => ["C:/logstash-6.4.2/sample/*.csv"]
    start_position => "beginning"
    # delimiter => "\n"
    file_chunk_size => 1048576
    file_completed_log_path => "C:\logstash-6.4.2\sample\filelog.txt"
    file_completed_action => "log"
    sincedb_path => "NUL"
  }
}
filter {
    csv {
        separator => ","
        columns => ["fieldname1", ... ]
        convert => {
            "fieldname1" => "float"
            … # lots of these
            "QuadBinLong_Lat_Deg" => "float"
            "QuadBinLong_Lon_Deg" => "float"
        }
    }
    mutate { add_field => { "datetime" => "%{SystemTime_Date}%{SystemTime_Local}" } }
    date {
        match => ["datetime", "yyyyMMddHHmmss.SSS", "yyyyMMddHHmmss.SS"]
        remove_field => ["datetime"]
    }

    mutate { rename => {"QuadBinLong_Lat_Deg" => "[location][lat]"} }
    mutate { rename => {"QuadBinLong_Lon_Deg" => "[location][lon]"} }
    mutate { remove_field => ["message"] }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-index-name"
    template => "C:\elastic\mytemplate.json"
  }
  stdout { codec => rubydebug }
}

I make sure to delete the index every time before running Logstash, but the same error comes back. I think I am missing something in the creation of the index/template, but I am not sure what. The goal is to load a large amount of numeric data, set the @timestamp field from a concatenation of the stored date and time fields, and get the two latitude and longitude fields into Elasticsearch's geo-point format, so that I can build both map and time-series visualizations from this data set. At the moment I cannot do either.
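
For clarity, the document shape I'm aiming for after the renames looks like this (field names taken from my config; the values here are made up):

```json
{
  "@timestamp": "2018-10-30T12:34:56.789Z",
  "fieldname1": 1.23,
  "location": {
    "lat": 40.015,
    "lon": -105.27
  }
}
```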
