Dumping last year's log data into ES - what do I do about indexing?

I'm new to ELK, and one of my tasks is stuffing 2014 data into ES so Kibana can present it and people will be happy and delighted.

Basically, we've now moved from 'kick the tires' to 'we like it, let's get it in shape so Marketing can use the data'. So ... indexing.

I left the default value of 'Index contains time-based events'. I now have a series of indices, one for each day since the server has been up, like so:

logstash-2015.06.04
logstash-2015.06.05

And so on.

But now I'm stuffing in data from 2014. (The output clause from my .conf file is below, along with what will probably be how I'll manage the file import.)

How in the wide world of sports can I edit 'index =>' so it includes yyyy.mm.dd from the events themselves? Is there a post-input task for this?

 zcat /var/log/2014/app01/missuniverse.com-access_log-20141201.gz | /opt/logstash/bin/logstash -w 14 -f fileimport.conf

fileimport.conf

(blah)
    output {
        elasticsearch {
            cluster => "elasticsearch.local"
            host => "127.0.0.1"
            protocol => "http"
            index => "logstash-2014"
            index_type => "apache"
        }
    }

The default 'index' value is "logstash-%{+YYYY.MM.dd}", which includes year, month, and day, so simply leaving it alone (rather than hardcoding "logstash-2014") would give you daily indices.
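
For example, here's a sketch of your output clause using the date-based naming (the other settings are copied unchanged from your config above):

    output {
        elasticsearch {
            cluster => "elasticsearch.local"
            host => "127.0.0.1"
            protocol => "http"
            # %{+YYYY.MM.dd} is expanded from each event's @timestamp,
            # so a December 2014 event lands in logstash-2014.12.01, etc.
            index => "logstash-%{+YYYY.MM.dd}"
            index_type => "apache"
        }
    }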

What you're probably asking about is the date{} filter, which lets you set the event's @timestamp to the timestamp parsed from the event itself. Without it, @timestamp defaults to the time Logstash read the line, so all of your 2014 events would land in today's index.
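
A minimal sketch of such a filter, assuming a grok stage has already captured the Apache timestamp into a field named "timestamp" (that field name and the pattern below are assumptions; adjust them to match your own filter chain):

    filter {
        date {
            # Parse e.g. "01/Dec/2014:06:25:03 -0800" from the grokked field
            # and use it as the event's @timestamp.
            match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
    }

With @timestamp set from the log line instead of the import time, the %{+YYYY.MM.dd} in the index name picks up the 2014 dates automatically, and no post-input step is needed.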