Load CSV file with Logstash and convert fields to numbers

Hi!
I'm trying to import a CSV file with Logstash.
I've used the csv filter, but I need to convert some field types to float.
This is my conf file:

input {
	file {
		path => "/home/alessandro/Scrivania/dataset/aft_air_data_small.csv"
		start_position => "beginning"
	}
}

filter {

csv {

	columns => [
		"YEAR",
		"AIRLINE_ID",
		"FL_NUM",
		"ORIGIN_AIRPORT_ID",
		"ORIGIN",
		"ORIGIN_CITY_NAME",
		"ORIGIN_STATE_ABR",
		"ORIGIN_STATE_NM",
		"DEST_AIRPORT_ID",
		"DEST",
		"DEST_CITY_NAME",
		"DEST_STATE_ABR",
		"CRS_DEP_TIME",
		"DEP_TIME",
		"DEP_DELAY",
		"CRS_ARR_TIME",
		"ARR_TIME",
		"ARR_DELAY",
		"CANCELLED",
		"AIR_TIME"
	]
	separator => ","

	remove_field => ["YEAR", "CRS_DEP_TIME", "CRS_ARR_TIME"]
}

mutate {
	convert => {
		"DEP_DELAY" => "float"
		"ARR_DELAY" => "float"
		"AIR_TIME" => "float"
	}
}

}
output {
	elasticsearch {
		hosts => ["localhost:9200"]
		action => "index"
		index => "aft_air_cont_index"
	}
}

Are there any errors? Thanks to everyone!

Your configuration looks okay. What's the problem? Can you give us example input and the output you get?
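
If it helps, here's a minimal sketch for inspecting what Logstash actually produces, assuming you run it from a terminal: temporarily swap the elasticsearch output for a stdout output with the rubydebug codec, so you can see how each CSV line is parsed and whether DEP_DELAY, ARR_DELAY and AIR_TIME come out as numbers rather than strings:

output {
	stdout {
		# pretty-print each event so field names, values and types are visible
		codec => rubydebug
	}
}

If the fields look correct there, the problem is more likely on the Elasticsearch side (for example an existing index mapping) than in your filter.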

Thank you for your support!
I've resolved it. My CSV file was malformed.
Thank you!
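
In case it helps anyone else hitting the same symptom: when a line doesn't match the configured columns, the csv filter tags the event (by default with _csvparsefailure, unless you've overridden tag_on_failure). A small sketch of how you could drop those events instead of indexing half-parsed documents:

filter {
	# ... csv and mutate blocks as above ...

	# skip events the csv filter could not parse (e.g. malformed lines)
	if "_csvparsefailure" in [tags] {
		drop { }
	}
}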

Hi,
I tried this, but with Longitude and Latitude as a location.
I couldn't generate a tile map. Any help?

@amran1, please start your own thread for your unrelated question (and when you do, please include more details about your configuration, what your input file looks like, etc).