Changing tokenizer in Elasticsearch 2.2.0

Hi, I created an index on my cluster as below:

curl -XPUT 'http://MYADDRESS:PORT/MYINDEX/?pretty' -d '{
  "settings": {
    "index": {
      "analysis": {
        "analyzer": {
          "korean": {
            "type": "custom",
            "tokenizer": "seunjeon_default_tokenizer"
          }
        },
        "tokenizer": {
          "seunjeon_default_tokenizer": {
            "type": "seunjeon_tokenizer"
          }
        }
      }
    }
  },
  "mappings": {
    "MYTYPE": {
      "properties": {
        "f1":  { "type": "integer" },
        "f2":  { "type": "string" },
        "f3":  { "type": "date", "format": "yyyy-MM-dd HH:mm:ss.SSS" },
        "f4":  { "type": "date", "format": "yyyy-MM-dd HH:mm:ss.SSS" },
        "f5":  { "type": "string" },
        "f6":  { "type": "string" },
        "f7":  { "type": "string" },
        "f8":  { "type": "string", "analyzer": "korean" },
        "f9":  { "type": "string", "analyzer": "korean" },
        "f10": { "type": "string" }
      }
    }
  }
}'
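
For reference, the custom analyzer can be tested directly with the _analyze API once the index exists (the Korean sample sentence below is arbitrary); the response should list the tokens the seunjeon tokenizer produces, which confirms the plugin is installed and the "korean" analyzer resolves:

curl -XGET 'http://MYADDRESS:PORT/MYINDEX/_analyze?pretty' -d '{
  "analyzer": "korean",
  "text": "아버지가방에들어가신다"
}'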

I've done this successfully and inserted over 10 million documents.

What I wanted to do was change the tokenizer settings, so I closed my index, changed the settings, and then reopened the index as below:

curl -XPOST 'http://MYADDRESS:PORT/MYINDEX/_close'

curl -XPUT 'http://MYADDRESS:PORT/MYINDEX/_settings' -d '{
  "index": {
    "analysis": {
      "analyzer": {
        "korean": {
          "type": "custom",
          "tokenizer": "seunjeon_default_tokenizer"
        }
      },
      "tokenizer": {
        "seunjeon_default_tokenizer": {
          "type": "mecab_ko_standard_tokenizer",
          "index_poses": ["UNK","EP","E","I","J","M","N","S","SL","SH","SN","V","VCP","XP","XS","XR"]
        }
      }
    }
  }
}'

curl -XPOST 'http://MYADDRESS:PORT/MYINDEX/_open'
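
After reopening, the applied analysis settings can be read back to confirm the update actually took:

curl -XGET 'http://MYADDRESS:PORT/MYINDEX/_settings?pretty'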

The problem is that my cluster state is now red, with none of the shards allocated. Every time I hit "refresh" in the head plugin, the cluster seems to randomly assign shards and then unassign them again. Am I doing something wrong? It's been a little over an hour, and still none of the shards has been allocated properly.
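
For reference, this is how I'm watching the shard state (standard cluster APIs, nothing specific to my setup):

curl -XGET 'http://MYADDRESS:PORT/_cluster/health?pretty'
curl -XGET 'http://MYADDRESS:PORT/_cat/shards/MYINDEX?v'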

Also, the cluster nodes seem to keep connecting and disconnecting. Did I do something wrong?

Check your logs; there should be something stating the problem.
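
On a typical package install the server log is at something like /var/log/elasticsearch/CLUSTERNAME.log (the exact path depends on how Elasticsearch was installed; CLUSTERNAME is a placeholder here):

tail -n 200 /var/log/elasticsearch/CLUSTERNAME.log

A likely suspect, given the settings change above, is that the mecab_ko_standard_tokenizer type can't be resolved because the mecab-ko analysis plugin isn't installed on every node, which makes every shard of the index fail to initialize. If the log confirms that, closing the index, restoring the original tokenizer settings, and reopening should bring the shards back:

curl -XPOST 'http://MYADDRESS:PORT/MYINDEX/_close'
curl -XPUT 'http://MYADDRESS:PORT/MYINDEX/_settings' -d '{
  "index": {
    "analysis": {
      "tokenizer": {
        "seunjeon_default_tokenizer": {
          "type": "seunjeon_tokenizer"
        }
      }
    }
  }
}'
curl -XPOST 'http://MYADDRESS:PORT/MYINDEX/_open'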