Logstash-jdbc-input Timezone issue

When I use the logstash-jdbc-input plugin to import data from SQL Server, the imported time is UTC-based, which does not match my original data. How can I solve this?

Second, sql_last_value is also always logged in UTC. How can I log it in my local timezone?
Thanks in advance.

> When I use the logstash-jdbc-input plugin to import data from SQL Server, the imported time is UTC-based, which does not match my original data. How can I solve this?

Elasticsearch always stores timestamps in UTC. Quoting https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html:

"Elasticsearch uses a set of preconfigured formats to recognize and parse these strings into a long value representing milliseconds-since-the-epoch in UTC."

> Second, sql_last_value is also always logged in UTC. How can I log it in my local timezone?

I don't think you can. Why does this matter?

Actually, I use it to sync my data to ES based on the timestamp, like below:

    statement => "SELECT * FROM t_Expert WHERE operateTime>:sql_last_value"

If their timezones are different, I can't use this SQL.

You can convert local timestamps to UTC in the query. My T-SQL is rusty, so I don't have an example.
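
For what it's worth, a rough sketch of that idea, assuming SQL Server 2016 or later (which provides AT TIME ZONE), that operateTime is stored as a local datetime, and with 'China Standard Time' as a placeholder zone name to replace with your server's:

    statement => "SELECT * FROM t_Expert WHERE (operateTime AT TIME ZONE 'China Standard Time') AT TIME ZONE 'UTC' > :sql_last_value"

Both sides of the comparison are then in UTC, so :sql_last_value can be used as-is.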

Hi Slion, I'm currently encountering the same issue. Did you find a solution?

Same issue here.

How can this be solved?

Thanks.

Not yet. It seems the parameter is useless.

I used jdbc_default_timezone to define the timezone of the database I'm connecting to, and it worked for me.

Now @timestamp and my table's date column are in the same timezone, so I can use date > :sql_last_value in the statement for the jdbc input plugin.
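
For reference, a minimal sketch of such an input, with placeholder connection details and "Asia/Shanghai" standing in for your database's actual timezone:

    input {
      jdbc {
        jdbc_driver_library => "/path/to/mssql-jdbc.jar"
        jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
        jdbc_user => "logstash"
        jdbc_password => "changeme"
        # Timezone in which the database stores its local timestamps, so the plugin
        # converts them to UTC and :sql_last_value lines up with the column.
        jdbc_default_timezone => "Asia/Shanghai"
        schedule => "* * * * *"
        statement => "SELECT * FROM t_Expert WHERE operateTime > :sql_last_value"
      }
    }

The key line is jdbc_default_timezone; the rest is ordinary jdbc input boilerplate you would adapt to your environment.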
