Datetime from MySQL via JDBC --> @timestamp assignment fails with _dateparsefailure

I'm using the jdbc input plugin to pull table data from a MySQL database. One of the fields in the MySQL table ("created_at") is of type DATETIME, and I need to assign it to @timestamp.

Here's part of my logstash.conf:

filter {
    date {
        match => [ "created_at","yyyy-MM-dd HH:mm:ss.SSSZ",
                                "yyyy-MM-dd HH:mm:ss.SSS",
                                "yyyy-MM-dd HH:mm:ss",
                                "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
                                "yyyy-MM-dd'T'HH:mm:ss.SSS",
                                "yyyy-MM-dd'T'HH:mm:ss",
                                "ISO8601",
                                "UNIX_MS",
                                "UNIX" ]
        add_field => { "debug" => "timestampMatched" }
    }
}
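
The input side is a standard jdbc block, roughly like this (the driver path, connection details, and the query are placeholders here, not my real values):

input {
    jdbc {
        jdbc_driver_library => "/path/to/mysql-connector-java.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
        jdbc_user => "user"
        jdbc_password => "password"
        statement => "SELECT id, user_id, spent_time, created_at FROM some_table"
    }
}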

Here's output (using rubydebug):

{
         "user_id" => 563,
      "created_at" => 2017-10-11T21:34:07.000Z,
      "spent_time" => 10,
        "@version" => "1",
      "@timestamp" => 2019-12-11T21:40:19.200Z,
            "tags" => [
        [0] "_dateparsefailure"
    ],
              "id" => 1415
}

There are no errors or exceptions in the log file.

If I test this config from the command line using echo:

echo '2017-10-11T21:34:07.000Z' | logstash -f logstash.conf

... I get the expected response (@timestamp is set to the incoming time):

{
    "@timestamp" => 2017-10-11T21:34:07.000Z,
         "debug" => "timestampMatched",
      "@version" => "1",
          "host" => "Xxxxxx.local",
       "message" => "2017-10-11T21:34:07.000Z"
}

All the other variations work, too, using the command line.
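
For reference, my command-line test config just wraps a similar date filter in a stdin input and a rubydebug output, roughly like this (there the match is against the stdin "message" field instead of "created_at"):

input {
    stdin { }
}

filter {
    date {
        match => [ "message", "ISO8601" ]
        add_field => { "debug" => "timestampMatched" }
    }
}

output {
    stdout { codec => rubydebug }
}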

I've run with the '--debug --config.debug=true' flags, but I still see no exceptions or errors to aid in debugging.

I'm using "logstash.version"=>"6.5.4"

How can I peel back the layers to see what's going on?
How can I debug this further?

That is already a LogStash::Timestamp (you can tell because there are no quotes around the value), and a date filter cannot parse that. Basically the jdbc plugin already did the conversion for you.

Ah, that's helpful, and the link in your post gave me the fix I needed.

The Answer

Since, as Badger explained, "created_at" was already a LogStash::Timestamp and the date filter can't parse that (despite my lengthy list of parsing options), I needed to convert "created_at" to a string so the date filter could parse it.

I added this to my config, just above the date filter:

mutate {
    convert => { "created_at" => "string" }
}

That worked! The @timestamp now matches the passed-in "created_at".
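
For completeness, the whole filter section now looks like this, with the mutate running before the date filter:

filter {
    mutate {
        convert => { "created_at" => "string" }
    }
    date {
        match => [ "created_at", "yyyy-MM-dd HH:mm:ss.SSSZ",
                                 "yyyy-MM-dd HH:mm:ss.SSS",
                                 "yyyy-MM-dd HH:mm:ss",
                                 "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
                                 "yyyy-MM-dd'T'HH:mm:ss.SSS",
                                 "yyyy-MM-dd'T'HH:mm:ss",
                                 "ISO8601",
                                 "UNIX_MS",
                                 "UNIX" ]
        add_field => { "debug" => "timestampMatched" }
    }
}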

Thanks for the help!
