Logstash-jdbc-input Timezone issue


(Gan Wei Shi) #1

When I use the logstash-jdbc-input plugin to import data from SQL Server, the imported time is UTC-based, which is not the same as my original data. How can I solve this?

Second, the sql_last_value is also always logged in UTC. How can I log it in my local timezone?
Thanks in advance.


(Magnus Bäck) #2

When I use the logstash-jdbc-input plugin to import data from SQL Server, the imported time is UTC-based, which is not the same as my original data. How can I solve this?

Elasticsearch always stores timestamps in UTC. Quoting https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html:

"Elasticsearch uses a set of preconfigured formats to recognize and parse these strings into a long value representing milliseconds-since-the-epoch in UTC."

Second, the sql_last_value is also always logged in UTC. How can I log it in my local timezone?

I don't think you can. Why does this matter?


(Gan Wei Shi) #3

Actually I use it to sync my data to ES based on the timestamp, like below:

    statement => "SELECT * FROM t_Expert WHERE operateTime>:sql_last_value"

If their timezones are different, I can't use this SQL.


(Magnus Bäck) #4

You can convert local timestamps to UTC in the query. My T-SQL is rusty, so I don't have an example at hand.
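
For illustration only, a hedged sketch of such a conversion (assuming SQL Server 2016 or later, which supports AT TIME ZONE, and assuming the column is stored in China Standard Time; adjust the zone name for your server):

    statement => "SELECT * FROM t_Expert WHERE operateTime AT TIME ZONE 'China Standard Time' AT TIME ZONE 'UTC' > :sql_last_value"

The first AT TIME ZONE tags the zoneless operateTime value with an offset; the second converts it to UTC so it can be compared against the UTC-based :sql_last_value.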


(moe55555) #5

Hi Slion, I'm currently encountering the same issue. Did you find a solution?


(Nelson Nunes) #6

Same issue here.

How can this be solved?

Thanks,


(Gan Wei Shi) #7

Not yet. It seems the parameter is useless.


(Enderson Maia) #8

I used jdbc_default_timezone to define the timezone of the database I'm connecting to, and it worked for me.

Now @timestamp and my table's date column are in the same timezone, so I can use date > :sql_last_value in my statement for the jdbc input plugin.
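
For anyone landing here later, a minimal pipeline sketch of that setup (the connection details, driver path, credentials, and the Asia/Shanghai zone are placeholders, not from this thread):

    input {
      jdbc {
        jdbc_driver_library => "/path/to/sqljdbc42.jar"
        jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
        jdbc_user => "user"
        jdbc_password => "password"
        # Interpret zoneless timestamps from the database as local time,
        # so the column lines up with :sql_last_value
        jdbc_default_timezone => "Asia/Shanghai"
        schedule => "* * * * *"
        statement => "SELECT * FROM t_Expert WHERE operateTime > :sql_last_value"
      }
    }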
