Elasticsearch and gzip

New ES user. Got the ELK stack built. Now I'm importing just north of 100 GB of log files, from 2014.

They are, of course, gzip'd.

Can I do this with the 'file' input plugin? Or do I need to throw it at ES directly with curl?

Or use the stdin input:

zcat file | bin/logstash -f logstash.conf
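For context, here's roughly the logstash.conf I'd pair with that pipe — a minimal sketch, assuming the 2014 logs are Apache-style access lines and a local ES node (the index name, grok pattern, and hosts value are placeholders, not my actual setup):

```
input {
  stdin { }    # read the lines piped in via zcat
}
filter {
  # placeholder: adjust the pattern to the actual log format
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  # index events under their own 2014 timestamps, not the import time
  date { match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-2014"
  }
}
```

And since it's a lot of gzip'd files, something like `for f in /path/to/logs/*.gz; do zcat "$f"; done | bin/logstash -f logstash.conf` would stream them all through one pipeline (path is a placeholder).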

Thank you, David.