Using date filter with field from autodetected column in csv plugin

I am trying to parse a date from a column whose name is autodetected by the csv plugin. The problem is that the date plugin does nothing with this field. If instead of using "autodetect_column_names" I list the names explicitly with "columns", it works. Why is this happening? How can I make this work with autodetect_column_names?

    input {
        file { path => "/path/to/file.csv" }
    }
    filter {
        csv {
            separator => ","
            autodetect_column_names => true
        }
        date {
            match => [ "timestamp_column", "dd/MM/yy HH:mm:ss ZZZ" ]
            target => "test_column"
        }
    }
    output {
        stdout { codec => "rubydebug" }
    }
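
For reference, this is a minimal sketch of the variant that does work for me, naming the columns explicitly (the column names here are placeholders for my real ones):

    filter {
        csv {
            separator => ","
            columns => [ "timestamp_column", "other_column" ]
        }
        date {
            match => [ "timestamp_column", "dd/MM/yy HH:mm:ss ZZZ" ]
            target => "test_column"
        }
    }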

Thank you so much in advance.

Did you read this?

Logstash pipeline workers must be set to 1 for this option to work.
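
That can be set either in logstash.yml or on the command line when starting Logstash:

    # logstash.yml
    pipeline.workers: 1

or

    bin/logstash -w 1 -f /path/to/pipeline.conf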

Yes. It has been set to 1 since the beginning.

Hi there,

Can you please post a sample of the output of that pipeline (the one with the autodetect)?


Sorry, I cannot share it because it contains sensitive information. But I have tested with a small toy example and it works there. I don't know why.

That's why I asked you to post here some content. My guess is you're parsing something wrong.

If your data is sensitive, you can mask the other fields and values with placeholders like "foo1"-"value1", "foo2"-"value2". I'm not interested in those, only in what your timestamp_column looks like.
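
For example, something like this (field names and values made up) would be enough:

    {
        "foo1" => "value1",
        "timestamp_column" => "31/12/20 23:59:59 UTC"
    }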

But if that were the case, it wouldn't make sense that it works when I name the columns instead of using autodetect.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.