How to update a field type of an existing index in Elasticsearch

Hi,

I need to change a field type, but I don't know how. I'm using Logstash to import data from a CSV file, so I didn't specify the field type manually, and I think the default is a string; I need to change it to long or integer. What is the command for updating the mapping settings, or at least a field's type?

Thank you!

You cannot do that. If you want to force a non-default data type, in most cases you need to explicitly specify it in the mapping definition (there are some exceptions, such as geo_point, which can be recognized by name).

You could change the Logstash index template and provide your own mapping.

Or manually create a mapping (before sending the first data): https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-mapping.html

If you have already indexed docs, you can't change the mapping, so you will need to reindex.
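
For example, a minimal sketch of the reindex step (the _reindex API exists from Elasticsearch 2.3 on; old_index and new_index are hypothetical names, and new_index must be created with the correct mapping first):

POST _reindex
{
  "source": { "index": "old_index" },
  "dest": { "index": "new_index" }
}

On older versions, or if the CSV is still at hand, simply re-running the Logstash import into the new index works too.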

I tried to manually create a mapping. When I check the result it seems to be working, even with some weird warnings when I run the Logstash command, but concretely it behaves like a string: when I try to do, for example, 1+1 (1 is my field value), it returns 11, not 2!

Where did you try your 1+1, in a script?

Exactly! I'm trying to do this:

POST /{index}/{type}/{id}/_update
{
    "script" : "ctx._source.{field}+=1"
}

I would first check GET /{index}/{type}/_mapping to see if the field is indeed long
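
For illustration, assuming a hypothetical index myindex, type mytype, and field myfield, you would expect something like:

GET /myindex/_mapping/mytype

{
  "myindex" : {
    "mappings" : {
      "mytype" : {
        "properties" : {
          "myfield" : { "type" : "long" }
        }
      }
    }
  }
}

If "type" still says "string" here, the += in the script will concatenate instead of add.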

I used something like this:

PUT {index}
{
  "mappings": {
    "{type}" : {
      "properties" : {
        "{field}" : { "type" : "long" }
      }
    }
  }
}

You should check whether the field type has actually changed. You cannot change a field's type once it exists: if you index a document without defining a mapping first, a mapping with the default type is created for that field, and it cannot be changed afterwards. You have to delete and recreate the index.
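
For example (myindex, mytype, and myfield are hypothetical names, and the DELETE wipes all data in the index):

DELETE /myindex

PUT /myindex
{
  "mappings": {
    "mytype" : {
      "properties" : {
        "myfield" : { "type" : "long" }
      }
    }
  }
}

Then re-import the data so it is indexed against the new mapping.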

Yes, that's exactly what I did! I deleted my index, created the mapping, and then sent the data from the CSV file with Logstash. It seems to be working when I check the new type, but Logstash displays a warning:

"error"=>"WriteFailureException; nested: MapperParsingException[failed to parse [field]]; nested: NumberFormatException[For input string: \"field\"]; "}}, :level=>:warn}

@dadoonet How can I change the Logstash config file to include field types, given that I'm using this syntax:

input {
  file {
    type => "............."
    path => ".........."
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [type] == ".........." {
    if [message] =~ /^COL1,COL2,COL3/ {
      drop { }
    }
    csv {
      columns => ["COL1","COL2","COL3"]
      separator => ";"
      source => "message"
      remove_field => ["message","host","path","@version","@timestamp"]
    }
  }
}

output {
  if [type] == ".........." {
    elasticsearch {
      hosts => "localhost:9200"
      index => ".........."
      document_type => ".........."
    }
  }
}

Thank you!

This is more a question for the #logstash group, but here we go: https://www.elastic.co/guide/en/logstash/2.3/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-template
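
As a sketch, the elasticsearch output can point at your own template file (the path and template name below are just examples):

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => ".........."
    manage_template => true
    # hypothetical path; the file holds an index template with your mappings
    template => "/path/to/my_template.json"
    template_name => "my_template"
  }
}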

Thank you very much for your time! Indeed, I found the solution here:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-convert

and here:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-csv.html#plugins-filters-csv-convert
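
Concretely, a minimal sketch of what that looks like in the csv filter from my config above (whether COL2 is the numeric column is an assumption; adjust to your data):

filter {
  csv {
    columns => ["COL1","COL2","COL3"]
    separator => ";"
    source => "message"
    # assumption: COL2 holds the numeric values to be indexed as integers
    convert => { "COL2" => "integer" }
  }
}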