(Issue) Logstash JDBC Input Plugin seems to be changing datetime column values at output

Hello,

I have an SQLite database where new data is written every second. Currently, I am using Logstash and its JDBC input plugin to read the data in near real-time and ship it to Elasticsearch for visualization. Everything is working great, except for one strange issue. One of the columns in this SQLite database is called "Time" and contains the datetime value of when the event occurred. For some reason, Logstash seems to be overwriting the time portion of all these values with 04:00:00.000Z (e.g. if the column value is "2011-07-16 10:00:11", Logstash outputs "2011-07-16T04:00:00.000Z"). This occurs for every row, every single time. I cannot figure out why this is happening, or, even stranger, why the hour seems to default to 04.

Some things to help with troubleshooting...

Versions and OS

Logstash 7.15.1
Windows 10

My Logstash configuration file


input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlite:C:\Analysis_Int\EventExport.db3"
    jdbc_driver_library => "C:\tmp\sqlite-jdbc-3.36.0.3.jar"
    jdbc_driver_class => "org.sqlite.JDBC"
    jdbc_user => ""
    statement => "SELECT * FROM FuelConsumed WHERE ID > :sql_last_value"
    schedule => "*/5 * * * * *"
    use_column_value => true
    tracking_column_type => "numeric"
    tracking_column => "id"
    last_run_metadata_path => "C:\tmp\logstash-7.15.1\last_run_metadata.log"
  }
}

output {
    stdout {
    }
}

SQLite column names and datatypes


| Column | Type code | Declared type |
|---|---|---|
| ID | 4 | INTEGER |
| TimelineID | 12 | TEXT |
| **Time** | 12 | DATETIME |
| UnitID | 12 | TEXT |
| UnitName | 12 | TEXT |
| UnitClass | 12 | TEXT |
| UnitSide | 12 | TEXT |
| FuelQuantity | 6 | DOUBLE |
| FuelType | 4 | INTEGER |
| MiscInfo | 12 | TEXT |

Another example of how the database values differ from the Logstash and JDBC output

In SQL

{
       "fuelquantity" => 0.001666667,
       "@version" => "1",
       "unitname" => "S 31 Sabalo",
       "unitside" => "Venezuela",
       "fueltype" => 4001,
       "unitclass" => "S 31 Sabalo [Type 209-1300]",
       "time" => 2011-07-16 10:04:10,
       "@timestamp" => 2021-11-03T17:18:20.287Z,
       "timelineid" => "14654039-c619-4bd5-8ac4-6988c31ad87e",
       "id" => 71908,
       "miscinfo" => nil,
       "unitid" => "8813afed-2d28-4f14-ae3b-3e0c6bb3f780"
}

Logstash and JDBC Output

{
       "fuelquantity" => 0.001666667,
       "@version" => "1",
       "unitname" => "S 31 Sabalo",
       "unitside" => "Venezuela",
       "fueltype" => 4001,
       "unitclass" => "S 31 Sabalo [Type 209-1300]",
       "time" => 2011-07-16T04:00:00.000Z,
       "@timestamp" => 2021-11-03T17:18:20.287Z,
       "timelineid" => "14654039-c619-4bd5-8ac4-6988c31ad87e",
       "id" => 71908,
       "miscinfo" => nil,
       "unitid" => "8813afed-2d28-4f14-ae3b-3e0c6bb3f780"
}

I do not see any difference between the [time] values in Logstash and SQL.

The code assumes your database times are in the local timezone, and Logstash stores everything as UTC. That's why 4 hours get added.
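That offset math is easy to sanity-check. Here is a minimal Python sketch (my own illustration, not from this thread) assuming the local zone is America/New_York, which is UTC-4 in July because of DST. If only the date part of "2011-07-16 10:04:10" survives parsing and midnight is then treated as local time, converting to UTC lands exactly on the mysterious 04:00:00:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Suppose only the date part of the stored value is parsed, and the
# resulting midnight is assumed to be local time (America/New_York,
# which is UTC-4 on 2011-07-16 due to daylight saving time).
local = datetime(2011, 7, 16, 0, 0, 0, tzinfo=ZoneInfo("America/New_York"))

# Logstash stores timestamps in UTC, so convert:
utc = local.astimezone(ZoneInfo("UTC"))
print(utc.isoformat())  # 2011-07-16T04:00:00+00:00
```

That matches the "2011-07-16T04:00:00.000Z" seen in the output above, which suggests the time-of-day portion is being dropped somewhere before the local-to-UTC conversion.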

If the [time] field in the DB includes a time I would expect that to end up in the LogStash::Timestamp.

That was an error, sorry, I have changed the question to reflect the actual comparison now.

Not sure what to say. The problem is most likely down in the sequel gem (see here and here). I don't see any obvious bugs that would cause it to only parse the date part of the string, but then I do not know that codebase at all.

Looking at this issue, it looks fairly easy to write very minimal test scripts in ruby. It should only require a handful of lines of code to see what comes back from the database.
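Not the Ruby/sequel script suggested above, but a stdlib-only Python sketch illustrates the same underlying point: SQLite has no native datetime storage class, so a column declared DATETIME typically stores the value as plain TEXT, and any interpretation of it as a timestamp happens in the driver/gem layer, not in the database:

```python
import sqlite3

# In-memory database mimicking the schema from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE FuelConsumed (ID INTEGER, Time DATETIME)")
conn.execute("INSERT INTO FuelConsumed VALUES (1, '2011-07-16 10:04:10')")

# typeof() reports the actual storage class SQLite used for the value.
row = conn.execute("SELECT Time, typeof(Time) FROM FuelConsumed").fetchone()
print(row)  # ('2011-07-16 10:04:10', 'text')
```

The value comes back as an ordinary string with the full time-of-day intact, so whatever turns it into a date-only timestamp is downstream of SQLite itself.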

There is something in JDBC that throws a fit at the SQL datetime datatype, but I found a sloppy workaround. I will try to clean this up later with an actual fix, but for right now, and for anyone else hitting this error: cast your datetime column to a string in the query, then apply a date filter in the Logstash config file to cast it back to the correct timestamp.

SQL Query Example
"Select Cast(Time As varchar(50)) As StringTime, * From FuelConsumed WHERE ID > :sql_last_value"

Logstash Config Example

filter {
    date {
      locale => "en"
      timezone => "EST"
      match => [ "stringtime", "yyyy-MM-dd HH:mm:ss" ]
      target => "@timestamp"
    }
}
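For anyone sanity-checking the workaround, here is a rough Python sketch (my own illustration, not from the thread) of what that date filter effectively computes: parse the string with the pattern, attach the configured zone, convert to UTC. One caveat worth hedging on: "EST" is usually treated as a fixed -05:00 offset, whereas a location zone like America/New_York would apply -04:00 in July, so summer events may end up an hour off.

```python
from datetime import datetime, timezone, timedelta

# "EST" as a fixed UTC-5 offset (no daylight saving adjustment).
EST = timezone(timedelta(hours=-5))

# Parse the cast string with the same pattern as the date filter,
# then attach the zone and convert to UTC, as Logstash stores @timestamp in UTC.
parsed = datetime.strptime("2011-07-16 10:04:10", "%Y-%m-%d %H:%M:%S")
stamped = parsed.replace(tzinfo=EST).astimezone(timezone.utc)
print(stamped.isoformat())  # 2011-07-16T15:04:10+00:00
```

Note the time-of-day now survives, which is the whole point of the string cast.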

Will write back if I find an actual fix. Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.