What is the best way to compress snapshot data?

We have a lot of data in our daily indices (around 200GB per day), so what is the best way to compress a daily index snapshot? We need to keep some of them for up to 6 months due to legal requirements.

I think you should read this: https://www.elastic.co/blog/store-compression-in-lucene-and-elasticsearch?q=compress%20blog

So basically, once the data goes cold, change the index codec to `best_compression` and call `_optimize` so the segments are rewritten with the new compression before you snapshot them.
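As a rough sketch of that workflow (the index name `logs-2015.06.01`, the repository name `my_backup`, and the host `localhost:9200` are placeholders for your own setup; the codec setting requires the index to be closed while it is changed):

```shell
# 1. Close the index so the codec setting can be updated.
curl -XPOST "localhost:9200/logs-2015.06.01/_close"

# 2. Switch the store codec to the DEFLATE-based best_compression.
curl -XPUT "localhost:9200/logs-2015.06.01/_settings" -d '{
  "index.codec": "best_compression"
}'

# 3. Reopen the index so it can be merged.
curl -XPOST "localhost:9200/logs-2015.06.01/_open"

# 4. Optimize down to one segment; existing segments are rewritten
#    with the new codec as part of the merge.
curl -XPOST "localhost:9200/logs-2015.06.01/_optimize?max_num_segments=1"

# 5. Snapshot the now-smaller index into an existing repository.
curl -XPUT "localhost:9200/_snapshot/my_backup/snapshot-2015.06.01" -d '{
  "indices": "logs-2015.06.01"
}'
```

Note that `_optimize` was later renamed `_forcemerge`, so on newer versions use that endpoint instead.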