How can I convert to geo_point a field extracted with translate filter?

Hi!
I have extracted geo information with the translate filter, using a YAML dictionary file like the following:
location1: 42.1497, -74.9384
location2: 34.5677, -32.8393
.........................................
The translate section of my Logstash conf file is:

translate {
	field => "ORIGIN_STATE_ABR"
	destination => "ORIGIN_LOCATION"
	dictionary_path => "/home/alessandro/Scrivania/dataset/us_dictionary.yaml"
}

mutate {
	convert => {
		"ORIGIN_LOCATION" => "string"
	}
	convert => {
		"DEP_DELAY" => "float"
		"ARR_DELAY" => "float"
		"AIR_TIME" => "float"
		"ORIGIN_LOCATION" => "geo_point"
	}
}

When I run Logstash I get the following error:

"Error: Cannot register filter mutate plugin. The error reported is:
Invalid conversion type '["string", "geo_point"]', expected one of 'string,integer,float,boolean'"

What's the problem?

Thanks to all!

Hi, you'd better ask this question in the Logstash forum.

Sorry! I'm confused.
Thank you very much

"Error: Cannot register filter mutate plugin. The error reported is:
Invalid conversion type '["string", "geo_point"]', expected one of 'string,integer,float,boolean'"

Logstash emits JSON documents and those don't have geo_point types. See Geo Point Type | Elasticsearch Guide [1.7] | Elastic for a list of what kind of input Elasticsearch accepts for fields mapped as geo_point.
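
For example, any of these document shapes should index into a field mapped as geo_point (the field name location and the coordinates are only examples):

	"lat,lon" string:      { "location": "42.1497,-74.9384" }
	object with lat/lon:   { "location": { "lat": 42.1497, "lon": -74.9384 } }
	array in [lon, lat]:   { "location": [ -74.9384, 42.1497 ] }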

Thank you for your support, magnusbaeck!
I've read your link and followed your suggestion.
Now, my mapping is

"origin_location": {
        "properties": {
          "location": {
            "type" : "geo_point"
          }
        }
      }

And my logstash.conf section for this field is

translate {
	field => "origin_state_abr"
	destination => "temp_origin_location"
	dictionary_path => "/home/alessandro/Scrivania/dataset/us_dictionary.yaml"
}

mutate {
	add_field => { "[origin_location][location]" => "%{temp_origin_location}" }
	convert => {
		"DEP_DELAY" => "float"
		"ARR_DELAY" => "float"
		"AIR_TIME" => "float"
	}
}

But when I run Logstash I get this error:

"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Mapper for [origin_location] conflicts with existing mapping in other types[mapper [origin_location.location] of different type, current_type [geo_point], merged_type [string]]"}}}, :level=>:warn}

Where's the problem?

Thank you for your support!
I don't have words to thank you.

You have two different mappings for origin_location.location in the same index. It was previously a string but now you're trying to make it a geo_point. You need to reindex.
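
If you want to confirm this, list the mappings of the index and look for origin_location.location appearing as "string" under one type and "geo_point" under another. A quick check (the host and index name are placeholders for your own):

	curl -XGET 'http://localhost:9200/your_index/_mapping?pretty'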

Thank you, magnusbaeck.
The steps that I followed are:

  1. I defined the index "air-data-index" with a mapping that specifies the "geo_point" type.
  2. I ran Logstash with the conf file.

I understand the error, but how can I reindex?
Do I have to create a new index and a new mapping?
Thank you!

Create a new index with correct mappings, bulk-copy documents from the old to the new index, and delete the original index. If you want the new index to be usable with the same name as the old one you can add an alias.
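
Since Elasticsearch 1.7 has no built-in reindex API, one way to do the copy is with Logstash itself, reading from the old index and writing into the new one. A rough sketch rather than a drop-in config: the new index name air-data-index-v2 is only an example, and the exact option names depend on your Logstash and plugin versions, so check their documentation:

input {
	# Read every document from the old index; docinfo copies _index, _type
	# and _id into @metadata so they can be reused on the output side.
	elasticsearch {
		hosts => ["localhost:9200"]
		index => "air-data-index"
		docinfo => true
	}
}

output {
	# Write into the new index, which must already exist with the geo_point mapping.
	elasticsearch {
		hosts => ["localhost:9200"]
		index => "air-data-index-v2"
		document_type => "%{[@metadata][_type]}"
		document_id => "%{[@metadata][_id]}"
	}
}

Once the copy checks out, delete the old index; only then can an alias with the old name be added to the new index.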

Reindexing has been covered here before. Please search the archives.

Thank you for your support, magnusbaeck.
I'm very grateful to you.