39m tweets on two nodes require 11 TB on S3?

[may be a dupe --- I sent from the wrong address originally]


ES 0.11.0-SNAPSHOT has been running solidly for a couple of months with an S3
Gateway, but there's a serious problem. Status shows 39m docs with a store
size of 50.5 GB (see https://gist.github.com/728313). An 'ls' of the
contents for the nodes shows thousands of '__*.partNN' files, some dated a
couple of months ago, and a 'du' of the indices/twitter directory (2 nodes)
shows 11 TB, quite a bit of wasted space!
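In case it helps anyone reproduce what I'm looking at, here's a rough sketch of the kind of check I'm running. The directory layout and the 30-day cutoff are just illustrative (the real path depends on your gateway settings), and this only lists stale-looking part files, it doesn't delete anything, since I don't know yet whether they're safe to remove:

```shell
# Simulated layout standing in for the real gateway data directory.
dir=$(mktemp -d)
mkdir -p "$dir/indices/twitter/0"

# Fake a part file dated 60 days ago, plus a fresh one for contrast.
# (touch -d is GNU coreutils; use -t on BSD/macOS.)
touch -d "60 days ago" "$dir/indices/twitter/0/__3.part01"
touch "$dir/indices/twitter/0/__9.part01"

# List part files not modified in the last 30 days -- candidates to
# investigate, NOT to blindly delete.
find "$dir/indices/twitter" -name '__*.part*' -mtime +30
```

On my nodes the equivalent `find` over indices/twitter turns up thousands of matches, which is where the 11 TB figure comes from.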

What's the best way to compress and clean up this mess? Are old part files
eligible for deletion? How do I keep this from happening again?