_dateparsefailure in ISO8601 date strings

I want to pull logs from an MSSQL server and parse their date strings. Unfortunately I cannot figure out why the date filter fails to parse the date. My filter looks like this:

filter {
  date {
    match => [ "evtime", "ISO8601" ]
    timezone => "UTC"
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched" }
  }
  fingerprint {
    source => ["@timestamp", "evhost", "evinfo"]
    method => "SHA1"
    target => "[@metadata][generated_id]"
  }
}

It does not work whether the timezone option is set or not. The entries Logstash produces look like this:

{
  "evuser": null,
  "evhost": "host",
  "evinfo": "Filter: xxx",
  "tags": ["_dateparsefailure"],
  "@version": "1",
  "evtime": "2021-05-14T18:02:32.000Z",
  "@timestamp": "2021-12-23T12:08:00.750Z"
}

Any hints on what's wrong with my timestamps?

Not sure why you are seeing that. I tested the exact same configuration and get the right results.

Conf

input {
  generator {
    message => '{ "evtime":"2021-05-14T18:02:32.000Z" }'
    count => 1
    codec => "json"
  }
}
filter {
  date {
    match => [ "evtime", "ISO8601" ]
    timezone => "UTC"
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched"}
  }
}
output {
  stdout { codec =>  "json" }
}

Output

{
    "@version": "1",
    "sequence": 0,
    "evtime": "2021-05-14T18:02:32.000Z",
    "@timestamp": "2021-05-14T18:02:32.000Z",
    "debug": "timestampMatched",
    "host": "Aarons-MBP"
}

I'm a bit surprised that it works for you. I can't find any reason why it shouldn't work, but it doesn't. Starting Logstash with --debug doesn't give any hint that it can't process the date string. Is there another way to get the date filter to give more information than adding tag_on_failure?
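For reference, the date filter's tag_on_failure option only lets you change which tag gets added when parsing fails; it does not add any diagnostics. A minimal sketch (the custom tag name here is just an example):

date {
  match => [ "evtime", "ISO8601" ]
  tag_on_failure => [ "_dateparsefailure_evtime" ]
}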

The only thing in my configuration that I don't see in yours is the json codec. Do you at some point use that codec or the json filter?

I hadn't configured it explicitly; the JDBC plugin already delivers the data as JSON. But when I configure the json codec explicitly, nothing changes. It still does not process the date without giving any hint as to why.

The fingerprint filter does work, so I guess it is not a problem with reading the input in general.

I wonder if you are hitting this issue. The jdbc input will convert date/time columns into LogStash::Timestamp objects, which the date filter cannot parse. You could try using mutate+convert to turn [evtime] into a string before parsing it.
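Applied to this case, the workaround might look like this (a sketch, assuming [evtime] is the jdbc date/time column; convert runs before the date filter, which then sees a plain string):

filter {
  mutate {
    convert => { "evtime" => "string" }
  }
  date {
    match => [ "evtime", "ISO8601" ]
    timezone => "UTC"
    target => "@timestamp"
  }
}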

Thanks a lot @Badger, this seems to be the problem. Not that I'm surprised. Since I started with Elastic in June, I have found bug after bug. Since Elastic had prepared this Logstash setup for me, saying "this should work", I thought it might be a problem with my environment. But I'll test it after the holidays and ask them to have a look.

So, after the holidays I had time to test it, and it is definitely that issue. It is a bit frustrating that the documentation doesn't mention that the input for the date filter has to be a string, and that there is no useful error message suggesting the type mismatch. That could have saved me a lot of time, but at least it is consistent with what I saw when encountering bugs in the other Elastic components.

So if anyone else stumbles onto this thread:

The jdbc plugin seems to receive dates as type date. The date filter isn't able to process dates as dates and wants a string (but doesn't tell you... because it's a lot funnier to leave this bug open for over 3 years and let everyone stumble over it). Fix:

filter {
  mutate {
    convert => { "date_field" => "string" }
  }
  date {
    match => [ "date_field", "ISO8601" ]
  }
}

Many thanks @Badger for pointing me to that issue.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.