Where is _dateparsefailure coming from?

I have CSV files with many columns and a few thousand lines each.

I am getting

_dateparsefailure

How do I find out which field is throwing this error?
I understand that one of the date fields is causing it, but which one? I have about 15 of them.

Any ideas?

If you have multiple date filters you could have them add tags if they succeed and at the end see which tags are missing.

You mean like this, which will create a tag called foo_%{start_time}?

date {
   match   => ["start_time", "dd-MMM-yy HH:mm"]
   target  => "start_time"
   add_tag => [ "foo_%{start_time}" ]
}

No, I am thinking of something more like

date { match => [ "column1", "..." ] target => "..." add_tag => [ "column1-OK" ] }
date { match => [ "column2", "..." ] target => "..." add_tag => [ "column2-OK" ] }
date { match => [ "column3", "..." ] target => "..." add_tag => [ "column3-OK" ] }

Then use a ruby filter that iterates over the set of tags you expect and gets a list of what is missing. Something like this (which I have not tested)...

# Collect the expected "column*-OK" tags that are missing from the event.
notOK = []
tags = event.get("tags") || []   # tags can be nil if no tag has been added yet
[ "column1-OK", "column2-OK", "column3-OK" ].each { |x|
    unless tags.include?(x)
        notOK << x
    end
}
event.set("columnsNotOK", notOK)
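
Putting the two pieces together, the filter section would look roughly like this (the column names and date patterns here are placeholders, and like the snippet above it is untested):

filter {
  # Each date filter tags the event only when its own match succeeds.
  date { match => [ "column1", "dd-MMM-yy HH:mm" ] target => "column1" add_tag => [ "column1-OK" ] }
  date { match => [ "column2", "dd-MMM-yy HH:mm" ] target => "column2" add_tag => [ "column2-OK" ] }
  date { match => [ "column3", "dd-MMM-yy HH:mm" ] target => "column3" add_tag => [ "column3-OK" ] }

  # Collect the expected tags that are missing into a single field.
  ruby {
    code => '
      notOK = []
      tags = event.get("tags") || []
      [ "column1-OK", "column2-OK", "column3-OK" ].each { |x|
        notOK << x unless tags.include?(x)
      }
      event.set("columnsNotOK", notOK)
    '
  }
}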

Perfect. Even when I only get column1-OK and column2-OK I will know which one is failing and can correct it.
I will test this out on standard out to get it fixed.
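
For reference, the usual way to test on standard out is a rubydebug output, something like:

output {
  # Print each event, including its tags and columnsNotOK, to the console.
  stdout { codec => rubydebug }
}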

Thank you @Badger again.

I added this to creation_date and got the tag displayed back.
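Roughly like this (the date pattern is just an example; I use whatever format creation_date actually has):

date {
  match   => [ "creation_date", "dd-MMM-yy HH:mm" ]
  target  => "creation_date"
  add_tag => [ "creation_date-OK" ]
}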

Then I added it to start_date but didn't get start_date-OK, so I fixed that and started getting this:

     [0] "creation_date-OK",
    [1] "start_date-OK",
    [2] "_dateparsefailure"

I will move on to a few more until I stop getting _dateparsefailure.
