Change Mappings of a Large Elasticsearch Index

Hi all,
I have a scenario in which Logstash writes log data to Elasticsearch indices. Initially, due to limited knowledge, I hadn't configured any index template, and after the indices reached counts of a million documents, Elasticsearch started performing really poorly.

I now have an index template in place with dynamic mappings that leaves all string fields unanalyzed. But how can I change the mappings of the previously created indices, which already contain existing data of a million documents?
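For context, the template I ended up with is along these lines, using the 1.x/2.x mapping syntax with a dynamic template that maps every string field as `not_analyzed` (the template name and index pattern here are just examples):

```
PUT _template/logstash_not_analyzed
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings_not_analyzed": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "string",
              "index": "not_analyzed"
            }
          }
        }
      ]
    }
  }
}
```

As far as I understand, a template only applies to indices created after it is installed, which is exactly why the existing indices are the problem.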

You'll have to reindex the old data into a new index. You might be able to use Logstash for that (@warkolm has posted a gist with that configuration) but there are also other options like es-reindex.
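In outline, the Logstash approach is an elasticsearch input feeding an elasticsearch output; a minimal sketch along the lines of that gist (hosts, index names, and exact option names are assumptions and vary by Logstash version):

```
input {
  elasticsearch {
    hosts => "localhost"
    index => "logstash-2015.06.01"        # source index (example name)
    docinfo => true                       # keep _index/_type/_id in @metadata
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "logstash-2015.06.01-v2"     # target index, created under the new template
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}
```

Once the new index is populated and verified, you can delete the old one and, if needed, alias the new index back to the old name.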


Appreciate your reply, Magnus.

I think the gist you are referring to is the following one:

Looking at it, it seems I'll have to work with one index at a time and execute it for all indices one by one.

Same seems to be the scenario with es-reindex.
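If that's the case, I suppose the per-index loop could be scripted; a rough sketch (the index list and the actual reindex command here are placeholders, not working calls):

```shell
#!/bin/sh
# Placeholder index list; in practice it could come from something like:
#   curl -s 'localhost:9200/_cat/indices/logstash-*?h=index'
for idx in logstash-2015.06.01 logstash-2015.06.02; do
  target="${idx}-v2"
  echo "reindexing $idx into $target"
  # ...run es-reindex, or a Logstash config templated with $idx / $target, here...
done
```

But that still feels clunky for a large number of daily indices.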

Can you point me in the right direction here?

Thanks!

I don't think it matters whether you use Logstash or es-reindex. Under the hood they'll use the same scan-and-scroll and bulk indexing mechanism. The latter uses HTTP, which theoretically performs slightly worse than the node or transport protocols that you might be using in the Logstash case, but that isn't necessarily significant. When I did some reindexing a while back I got ~5000 msg/s with es-reindex.

Thanks for all the help. I ended up using Logstash to reindex the data.