I am fairly new to Elasticsearch.
I am trying to upgrade my cluster for better performance and functionality, in particular the aggregation pipeline feature.
To run a proper aggregation, I would like to use the copy_to option rather than a script, because the data I want to aggregate is spread across several different fields.
I followed the documentation here for copy_to and here for updating the _mapping.
However, it seems that copy_to affects only newly indexed documents, not existing ones.
I ran

curl -XPUT 'http://localhost:9200/_mapping/type?pretty=true' -d '{
  "properties" : {
    "fieldA" : {
      "type" : "string",
      "copy_to" : "fieldB"
    }
  }
}'

and checked that the mapping was updated as below.
...
"properties" : {
  "fieldA" : {
    "type" : "string",
    "copy_to" : "fieldB"
  },
  ...
},
...
Am I missing something?
Is there any clever way to copy data from an existing field into another field?
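For context on what I have tried so far: my understanding is that copy_to is applied only at index time, so existing documents would have to be reindexed for the new field to be populated. A minimal sketch of that idea, assuming a recent Elasticsearch version that supports the _update_by_query API (2.3+) and a placeholder index name myindex:

```shell
# Reindex every document in place with no script; each document is
# re-run through the indexing pipeline, which re-applies copy_to.
# (myindex is a placeholder -- substitute your actual index name.)
curl -XPOST 'http://localhost:9200/myindex/_update_by_query?conflicts=proceed&pretty'
```

On versions without _update_by_query, the equivalent would be reading all documents back (e.g. via a scan/scroll search) and bulk-indexing them into the same or a new index, so that the updated mapping is applied.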