_dateparsefailure while using JDBC plugin

I am trying to read some data from a database, but when I try to parse a date field I get a _dateparsefailure, even though the field I am trying to convert is in ISO8601 format.

Here is my full logstash config:

input {
  jdbc {
    jdbc_driver_library => "---"
    jdbc_driver_class => "---"
    jdbc_connection_string => "---"
    jdbc_user => "---"
    jdbc_password => "---"
    schedule => "* * * * *"
    statement => "select calldate from ---"
    
  }
}

filter {
  
  date {
    match => [ "calldate", "ISO8601" ]
    target => "db_time"

  }

}
output {
  stdout {}
}

It connects to the database, but when it tries to parse the calldate field it throws an error. Here is a sample output:

{
      "calldate" => 2025-02-15T13:07:31.000Z,
      "@version" => "1",
          "tags" => [
        [0] "_dateparsefailure"
    ],
    "@timestamp" => 2025-02-16T11:05:00.341017552Z
}

I checked: for example, when I run the timestamp 2025-02-15T13:07:31.000Z through the following config, it works fine:

input {
  stdin { }
}
 
 
filter {
  date {
    match => [ "message", "ISO8601" ]
    target => "db_time"
  }
}
 
 
output {
  stdout{}
}

Am I making some obvious mistake?

Sorry to ask a redundant question, but I believe this is a duplicate question.

Can you show how the message or calldate field shows up in the filter, without any transformation?

"calldate" => 2025-02-15T13:07:31.000Z - without quotes, this is already in the timestamp format

"calldate" => "2025-02-15T13:07:31.000Z" - with quotes, this would be the string format for the date conversion.

As mentioned in the duplicate post, the jdbc input already converts date columns into timestamp values in Logstash.

So there is no need to use a date filter on those values; doing so results in this error because Logstash tries to convert the field into its internal date type, but the field is already converted.
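
As a minimal sketch of what the filter could look like instead (assuming you still want the value available under db_time; mutate's copy option does the work here):

filter {
  # calldate is already a Logstash timestamp when it comes out of the jdbc input,
  # so copy it into db_time instead of parsing it again with a date filter.
  mutate {
    copy => { "calldate" => "db_time" }
  }
}

Alternatively, if you really want the date filter to do the parsing, you could cast the column to a string in the SQL statement (for example CAST(calldate AS CHAR), with the exact syntax depending on your database), so the filter receives a string rather than an already-converted timestamp.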
