CSV filter - Create Index per log timestamp

Hi

I want to create ES indices based on the dates found in the log file. I am using the Logstash CSV filter to process the logs. For instance, the log data looks like this:

2016-02-21 00:02:32.238,123.abc.com,data
2016-02-22 00:04:40.145,345.abc.com,data

Below is the Logstash configuration file. Obviously the index will be created as testlog; however, I want the indices to be created as testlog-2016.02.21 and testlog-2016.02.22, given that YYYY.MM.dd is Logstash's preferred format for index dates. I have done this with grok filters, and I am trying to achieve the same with csv, but it doesn't seem to work.

filter {
    csv {
        columns => [ "timestamp", "host", "data" ]
        separator => ","
        remove_field => ["message"]
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "testlog"
    }
}

We are on Logstash 2.1.0, ES 2.1.0, and Kibana 4.3.0.

Any input is appreciated.

Take a look at https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-index
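In short, the index option supports sprintf date formatting based on the event's @timestamp field, so something like the following (untested) gives you one index per day:

    index => "testlog-%{+YYYY.MM.dd}"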

Thanks. After changing the output section as below, however, the index is created based on today's date. I want the index to be created based on the date in the log file.

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "testlog-%{+YYYY.MM.dd}"
    }
}

Index:
yellow open testlog-2016.03.12 5 1 11 0 6.8kb 6.8kb

Then you need to add a date filter to your config 🙂
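Something along these lines, as a sketch; the match pattern has to mirror your actual timestamp format:

filter {
    date {
        # parse the CSV "timestamp" column into @timestamp, which the
        # %{+YYYY.MM.dd} in the index name is rendered from
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    }
}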

Included the date filter and tried adding a new field called logdate, which is referenced in the index name. I am getting a "_dateparsefailure" now. I am sure I am doing something wrong:

filter {
    csv {
        columns => [ "timestamp", "host", "data" ]
        separator => ","
        remove_field => ["message"]
    }
    date {
        match => [ "@timestamp", "YYYY-MM-dd" ]
        add_field => { "logdate" => "@timestamp" }
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "testlog-%{logdate}"
    }
}

That's not quite it; you need to take the time into account as well. Take a look at http://grokdebug.herokuapp.com/patterns# to see if you can find anything that matches.

        match => [ "@timestamp", "YYYY-MM-dd" ]

It's the timestamp field you're parsing, so this should be:

        match => [ "timestamp", "YYYY-MM-dd" ]

        add_field => { "logdate" => "@timestamp" }

Why are you duplicating the timestamp into another field?

I got the same "_dateparsefailure" with the timestamp field as well, so I changed it to see if @timestamp works.

date {
    match => [ "timestamp", "YYYY-MM-dd" ]
}

As @warkolm said, you need to adjust your date pattern so that it matches your timestamp format. According to what you wrote earlier, you also have hours, minutes, seconds, and milliseconds.
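Putting the pieces together, something like this should yield testlog-2016.02.21-style indices (a sketch based on the sample lines you posted, untested):

filter {
    csv {
        columns => [ "timestamp", "host", "data" ]
        separator => ","
        remove_field => ["message"]
    }
    date {
        # "2016-02-21 00:02:32.238" -> date plus hours, minutes,
        # seconds, and milliseconds
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        # %{+YYYY.MM.dd} is formatted from @timestamp, which the date
        # filter above sets from the log line
        index => "testlog-%{+YYYY.MM.dd}"
    }
}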