How to change tokenizer in elasticsearch in the existing index

Hi!

I have the following problem: I have an index of 30 million documents with the following mapping:

curl -XPUT 'localhost:8080/xxxxx/yyyyy/_mapping?pretty=true' -d '{
  "xxxxx": {
    "_id": { "type": "string", "index": "not_analyzed" },
    "properties": {
      "content":      { "type": "string",  "store": "no" },
      "title":        { "type": "string",  "index": "no" },
      "created_date": { "type": "integer", "index": "not_analyzed" },
      "url":          { "type": "string",  "index": "not_analyzed" },
      "author":       { "type": "string",  "index": "no" },
      "author_url":   { "type": "string",  "index": "no" },
      "domain":       { "type": "string",  "index": "not_analyzed" },
      "lang":         { "type": "string",  "index": "no" }
    }
  }
}'

No tokenizer is specified in the settings, so the standard one is applied. I would like to run a facets query on the "content" field to build a ranking of links (URLs). Unfortunately I cannot do that, because the standard tokenizer splits links (URLs) into pieces. My question: can I change the tokenizer on an existing index without reindexing, so that new documents added to the index are processed by the new tokenizer (uax_url_email) while old documents remain unchanged?
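To illustrate the difference, this is roughly how I compared the two tokenizers with the _analyze API (the sample text is made up; _analyze does not touch the index, and I am not 100% sure of the query-parameter form on every version):

curl -XGET 'localhost:8080/_analyze?tokenizer=standard&pretty' \
  -d 'see http://example.com/page'
# -> the URL comes back split into tokens like "http", "example.com", "page"

curl -XGET 'localhost:8080/_analyze?tokenizer=uax_url_email&pretty' \
  -d 'see http://example.com/page'
# -> "http://example.com/page" comes back as a single token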

I tried this:

curl -XPUT 'localhost:8080/xxxxx' -d '{
  "settings": {
    "index": {
      "analysis": {
        "analyzer": {
          "default": {
            "type": "custom",
            "tokenizer": "uax_url_email",
            "filter": "lowercase"
          }
        }
      }
    }
  }
}'

but I get this error: {"error": "IndexAlreadyExistsException [[xxxxx] Already exists]", "status": 400}
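From what I have read, some index settings can be updated if the index is closed first, so I also considered a sequence like the one below (a sketch only; I am not sure this is allowed for analysis settings, or what it means for data that was already indexed with the old tokenizer):

# close the index, update the analysis settings, then reopen it
curl -XPOST 'localhost:8080/xxxxx/_close'

curl -XPUT 'localhost:8080/xxxxx/_settings' -d '{
  "analysis": {
    "analyzer": {
      "default": {
        "type": "custom",
        "tokenizer": "uax_url_email",
        "filter": ["lowercase"]
      }
    }
  }
}'

curl -XPOST 'localhost:8080/xxxxx/_open'

But even if this succeeds, I suspect the old documents would still be stored with tokens produced by the standard tokenizer, since analysis happens at index time.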

Is there another way, without reindexing, to build a ranking of links (URLs) with a facets query?

Thank you in advance for any help.