Using convert => date in csv filter

What am I missing here? When I run this configuration

input {
  generator {
    message => '2018/03/16,2018-02-15,2003-11-19 13:01:59'
    count => 1
  }
}

filter {
  csv {
    convert => {
      "column1" => "date"
      "column2" => "date"
      "column3" => "date_time"
    }
  }
}

output {
  stdout { codec => rubydebug }
}

I get the following, where column1 clearly was not converted.

       "column1" => "2018/03/16",
       "column2" => 2018-02-15T05:00:00.000Z,
       "column3" => 2003-11-19T13:01:59.000Z,
    "@timestamp" => 2018-04-11T17:03:41.220Z

The csv filter uses CSV::Converters, and the documentation for those says that the `:date` converter "Converts any field Date::parse() accepts". When I run `logstash -i irb`, it appears to parse the column1 format just fine:

irb(main):001:0> Date.parse('2018-02-15')
=> #<Date: 2018-02-15 ((2458165j,0s,0n),+0s,2299161j)>
irb(main):002:0> Date.parse('2018/02/15')
=> #<Date: 2018-02-15 ((2458165j,0s,0n),+0s,2299161j)>
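Digging a bit further: assuming the filter really does delegate to Ruby's stdlib CSV converters, the stdlib `:date` converter does not hand every field to `Date.parse`. It first gates on `CSV::DateMatcher`, a regex that accepts `YYYY-MM-DD` and month-name forms but not slash-separated dates, which would explain why column1 passes through untouched even though `Date.parse` accepts it:

```ruby
require 'csv'
require 'date'

# CSV::DateMatcher is the regex the stdlib :date converter checks
# before it ever calls Date.parse; non-matching fields pass through.
p CSV::DateMatcher.match?('2018-02-15')  # ISO form matches
p CSV::DateMatcher.match?('2018/03/16')  # slash form does not

# Running the stdlib converter directly shows the same asymmetry:
row = CSV.parse_line('2018/03/16,2018-02-15', converters: :date)
p row[0].class  # slash date is left as a String
p row[1].class  # ISO date is converted to a Date
```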

I can work around this using a date{} filter, but I am curious how to find out which formats `convert => date` actually accepts.
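For reference, the date{} filter workaround I have in mind looks roughly like this (an untested sketch against the sample data above; `yyyy/MM/dd` matches column1's format, and `target` writes the parsed value back into the original field instead of `@timestamp`):

```
filter {
  csv { }
  date {
    match  => [ "column1", "yyyy/MM/dd" ]
    target => "column1"
  }
}
```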
