I'm using the jdbc input plugin to grab table data from a MySQL database. One of the fields in the MySQL database ("created_at") is of type DATETIME, which I need to assign to @timestamp.
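For context, the input section looks roughly like this (the connection string, credentials, driver path, table name, and query below are placeholders, not my real values):
input {
  jdbc {
    # Placeholder connection details
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # Placeholder query; the real one selects created_at along with the other columns
    statement => "SELECT id, user_id, spent_time, created_at FROM my_table"
  }
}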
Here's the filter section of my logstash.conf:
filter {
  date {
    match => [ "created_at",
               "yyyy-MM-dd HH:mm:ss.SSSZ",
               "yyyy-MM-dd HH:mm:ss.SSS",
               "yyyy-MM-dd HH:mm:ss",
               "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
               "yyyy-MM-dd'T'HH:mm:ss.SSS",
               "yyyy-MM-dd'T'HH:mm:ss",
               "ISO8601",
               "UNIX_MS",
               "UNIX" ]
    add_field => { "debug" => "timestampMatched" }
  }
}
Here's the output (using the rubydebug codec):
{
"user_id" => 563,
"created_at" => 2017-10-11T21:34:07.000Z,
"spent_time" => 10,
"@version" => "1",
"@timestamp" => 2019-12-11T21:40:19.200Z,
"tags" => [
[0] "_dateparsefailure"
],
"id" => 1415
}
There are no errors or exceptions in the log file.
If I test this config using echo on the command line:
echo '2017-10-11T21:34:07.000Z' | logstash -f logstash.conf
... I get the expected response (@timestamp is set to the incoming time):
{
"@timestamp" => 2017-10-11T21:34:07.000Z,
"debug" => "timestampMatched",
"@version" => "1",
"host" => "Xxxxxx.local",
"message" => "2017-10-11T21:34:07.000Z"
}
All the other format variations work from the command line, too.
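For example, tests along these lines (same logstash.conf; the date strings are just illustrative samples of the other patterns in the match list) also come back with @timestamp set:
echo '2017-10-11 21:34:07' | logstash -f logstash.conf
echo '2017-10-11T21:34:07.000' | logstash -f logstash.conf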
I've run with the '--debug --config.debug=true' flags, but I still see no exceptions or errors to aid in debugging.
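Concretely, that invocation looked roughly like this (same config file as above):
logstash -f logstash.conf --debug --config.debug=true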
I'm using "logstash.version"=>"6.5.4"
How can I peel back the layers to see what's going on?
How can I debug this further?