Copy data from old index over to new one


I implemented an ELK Stack in our environment a few days ago and originally created two indexes, which I specified in both Logstash configuration files: "logstash_syslogs" and "logstash_netflow".

A few days later I realized that having two separate indexes isn't necessary and won't work with having one main dashboard to visualize the data. I went ahead and changed the configurations to output to a single "logstash-events" index.

I have over 5 million logs in the old "logstash_syslog" index and was wondering if I can transfer that data over to my new "logstash-events" index.

Is this possible?

Just fetch everything from the old index and bulk index it into the new one.
In Python I would do it like this:
    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch()
    a = helpers.scan(es, query={"query": {"match_all": {}}}, scroll='1m',
                     index=INDEX_NAME_old, doc_type=TYPE_NAME)
    c = 0
    t2 = []
    # if you do not know how many docs are too big for one bulk request,
    # trial and error ;-)
    for aa in a:
        if c % 500 == 0 and t2:
            es.bulk(body=t2, request_timeout=30)
            t2 = []
        op_dict = {
            "index": {
                "_index": INDEX_NAME_new,
                "_type": TYPE_NAME,
                "_id": aa["_id"]
            }
        }
        data_dict = aa["_source"]
        t2.append(op_dict)
        t2.append(data_dict)
        c += 1

And the last bulk, for whatever is left over after the loop:

    if t2:
        es.bulk(body=t2, request_timeout=30)

If you are on 2.3.x and these indexes are in the same cluster, then have a look at reindex.
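
For reference, a minimal _reindex request (the API is available from Elasticsearch 2.3 onward) would look roughly like this, assuming the index names from the question:

    POST /_reindex
    {
      "source": {
        "index": "logstash_syslogs"
      },
      "dest": {
        "index": "logstash-events"
      }
    }

This copies documents server-side, so there is no need to pull them through a client as in the Python approach above.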

I am currently on 2.2.1 and running everything on Ubuntu 16.04; forgot to mention that, sorry.