Dumping last year's log data into ES - what do I do about indexing?

(Brian Dunbar) #1

New to ELK, one of my tasks is stuffing 2014 data into ES so Kibana can present it and people will be happy and delighted.

Basically, we've now moved from 'kick the tires' to 'we like it, let's get it in shape so Marketing can use the data'. So .. indexing.

I left the default value of 'Index contains time-based events' in place. I now have a series of indices, one for each day since the server has been up, like so:


And so on.

But I'm stuffing data from 2014 in now. (The output clause from my .conf file is below, along with what will probably be how I'll manage the file import.)

How in the wide world of sports can I edit 'index =>' to include yyyy.mm.dd? Is there a post-input task?

 zcat /var/log/2014/app01/missuniverse.com-access_log-20141201.gz | /opt/logstash/bin/logstash -w 14 -f fileimport.conf


    output {
        elasticsearch {
            cluster => "elasticsearch.local"
            host => ""
            protocol => "http"
            index => "logstash-2014"
            index_type => "apache"
        }
    }

(rastro) #2

The default 'index' value includes year, month, and day, so using it would give you daily indexes.
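In other words, you could simply drop your hard-coded index name and let the output use the date-based default, which interpolates the event's @timestamp via the sprintf date format. A sketch, adapting your own output clause (the cluster/host values are from your post, not something you need to change):

    output {
        elasticsearch {
            cluster => "elasticsearch.local"
            host => ""
            protocol => "http"
            # %{+YYYY.MM.dd} is filled in from each event's @timestamp,
            # so 2014 events land in logstash-2014.12.01, etc.
            index => "logstash-%{+YYYY.MM.dd}"
            index_type => "apache"
        }
    }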

What you're probably asking about is the date{} filter, which would allow you to set the event's @timestamp value to the date from the event itself.
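For an Apache access log, that might look roughly like the sketch below. It assumes you've already grokked the request timestamp into a field named timestamp (e.g. via the stock COMBINEDAPACHELOG pattern); the field name is an assumption, not something from your config:

    filter {
        date {
            # Apache's clf timestamp, e.g. 01/Dec/2014:06:25:11 -0800.
            # On a successful match, @timestamp is set to this value.
            match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
    }

With @timestamp rewritten to the 2014 date from each log line, a date-based index name will put the events into the correct daily indices, and Kibana's time picker will show them in the right place.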

(system) #3