Use "best_compression" index setting in 2.1.1?

The "best_compression" option for the index.codec setting is marked as experimental in the 2.1.1 docs:
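For reference, it is a static index setting supplied at index creation time. A minimal sketch (the index name `my_index` is hypothetical):

```
PUT /my_index
{
  "settings": {
    "index.codec": "best_compression"
  }
}
```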

Is this in fact still experimental? Seems like it is being marketed & discussed enough to be here to stay.

I'm asking because I'd like to use it, but not if it will cause problems down the road should it be removed. E.g. if I set up my index to use best_compression and it is then removed in 2.xxxxx, what would I have to do to overcome that? Reindex all my data in order to upgrade?
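For what it's worth, switching the codec back would not necessarily mean a full reindex: index.codec is a static setting that can be changed on a closed index, and a force merge rewrites the existing segments using the current codec. A hedged sketch, assuming an index named `my_index`:

```
POST /my_index/_close

PUT /my_index/_settings
{
  "index.codec": "default"
}

POST /my_index/_open

POST /my_index/_forcemerge?max_num_segments=1
```

Until the force merge runs, old segments keep their original codec and are simply read as-is.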

It would be rather absurd if future versions could not read older index segments. The compression is implemented in Lucene, and since ES 2.x the ES team has spent a lot of effort on checking backward compatibility in this area. It would be as if gzip were suddenly unable to decompress data from 1999.

Here is my guess: "best_compression" is marked experimental not because it will be removed, but because the gain over the standard compression ratio is modest and it won't become the default setting. The CPU cost does not seem to justify the little space saved.

You should test it and report your findings; maybe keeping the setting is worthwhile, contrary to common expectations.

Relates #16644, "[docs] Does index.codec best_compression still need to be experimental?".