About Logstash date filter time zone questions


#1

I use logstash-input-jdbc to sync data from MySQL to Elasticsearch. MySQL has a time field of type timestamp, and the local time zone is +8:00, but after syncing to Elasticsearch the time value is 8 hours behind. Does anyone know the answer? Please help!
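My pipeline is roughly of this shape (the connection string, query, and index name below are placeholders, not my real values):

  input {
    jdbc {
      jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"  # placeholder
      jdbc_user => "user"                                           # placeholder
      jdbc_password => "password"                                   # placeholder
      jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
      statement => "SELECT * FROM my_table"                         # placeholder query
    }
  }
  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "my_index"                                           # placeholder
    }
  }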


(Petr Simik) #2

You can define the timezone with the date filter;
check the reference:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html

For instance, I use:

  date {
    match => ["msg_timestamp", "YYYYMMddHHmmss"]
    timezone => "Etc/UTC"
    target => "msg_timestamp"
  }

#3

Yes, I tried every format, but it failed. I use MySQL, and the time type is timestamp. Which format should I use?


#4

Well, if the format is unix timestamp (epoch), the format should be either UNIX or UNIX_MS (if the timestamp includes milliseconds) according to the documentation: https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html#plugins-filters-date-match

So change the match to:

  match => ["msg_timestamp", "UNIX"]

If it still does not work, we need to see an example of the data and your config exactly.


#5

In MySQL the type is timestamp. The problem is that I don't know the format of the value. Without any conversion, Elasticsearch shows {"updated_at": "2018-11-16T08:00:01.000Z"}, so which format suits?


#6

In which format is the timestamp in MySQL? Can you show a sample of the data? Is it in local time or UTC?
You want to save the data in UTC (with the date filter) into Elasticsearch; Kibana will handle the conversion to your local timezone if you use it.
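For example, a date filter like this interprets a local-time value as Asia/Shanghai and stores it as UTC (the field name and pattern here are assumptions; adjust them to your data):

  date {
    match => ["updated_at", "yyyy-MM-dd HH:mm:ss"]
    timezone => "Asia/Shanghai"
    target => "updated_at"
  }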


#7

Timestamp is one of the MySQL time types. In MySQL it is local time, but in Elasticsearch it is UTC, so it is behind by the timezone offset. I consume the Elasticsearch data from another application, not Kibana, so I want the time stored in Elasticsearch to display correctly.


#8

I can solve the timezone problem another way, but that way is complex. I want a simpler way.


#9

Ah okay, now I think I understand.
Right, you know the timestamp format better than I do.

The date filter in Logstash ALWAYS converts to UTC. What you want is a field that carries the timestamp in your own timezone. That is not possible with the date filter; your only option is the ruby filter, as you have already done.
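For example, something along these lines adds a local-time string field next to the UTC @timestamp (the target field name and the +08:00 offset are assumptions for your case):

  ruby {
    code => '
      # shift the UTC @timestamp to the local offset and store it as a string
      local = event.get("@timestamp").time.localtime("+08:00")
      event.set("updated_at_local", local.strftime("%Y-%m-%dT%H:%M:%S.%L%:z"))
    '
  }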

If this is what you want, you can also check my other reply about the same issue.


#10

Yes, I just read that link. I used the ruby filter to solve this problem for now. Thanks anyway!


(Petr Simik) #11

In the Logstash date filter you specify the timezone of the source data.
If the source data (MySQL) is in local time (Asia/Shanghai), your setup is correct:

  timezone => "Asia/Shanghai"

Now the data is inserted correctly into the Elasticsearch index.

Next you need to display the data. How do you query it?

If you are using Kibana, go to
Kibana -> Management -> Kibana / Advanced Settings -> "Timezone for date formatting" and select Asia/Shanghai.

Then you will see the data in the format you need.