I use logstash-input-jdbc to sync data from MySQL to Elasticsearch. MySQL has a time field of type TIMESTAMP, and the local time zone is +8:00, but after syncing to Elasticsearch the time value is 8 hours behind. Does anyone know the answer? Please help!
You can define the timezone with the date filter; check the reference.
For instance, I use:
date {
  match    => ["msg_timestamp", "YYYYMMddHHmmss"]
  timezone => "Etc/UTC"
  target   => "msg_timestamp"
}
Yes, I tried every format, but it failed. I use MySQL, and the time type is TIMESTAMP. Which format should I use?
Well, if the format is a Unix timestamp (epoch), the format should be either UNIX or UNIX_MS (if the timestamp includes milliseconds), according to the documentation: https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html#plugins-filters-date-match
So change the match to
match => ["msg_timestamp", "UNIX"]
If it still does not work, we need to see an example of the data and your exact config.
In MySQL the type is TIMESTAMP; the problem is that I don't know its format. Without any conversion, Elasticsearch shows {"updated_at": "2018-11-16T08:00:01.000Z"}, so which format suits?
In which format is the timestamp in MySQL? Can you show a sample of the data? Is it in local time or UTC?
You want to save the data into Elasticsearch in UTC (with the date filter); Kibana will handle the conversion to your local timezone if you use it.
TIMESTAMP is one of MySQL's time types. In MySQL the value is in local time, but in Elasticsearch it is UTC, so it is behind by the timezone offset. I consume the Elasticsearch data from another application, not Kibana, so I want the time in Elasticsearch to be shown correctly.
I can solve the timezone problem another way, but that way is quite complex. I want a simple way.
https://discuss.elastic.co/uploads/short-url/a93ng1fxTCS3eJCGpJ3tCt6kFGV.png
Ah okay, now I think I understand.
Right, you know the timestamp format better than I do.
The date filter in Logstash ALWAYS converts to UTC. What you want is a field that holds the timestamp in your own timezone. This is not possible with the date filter, and your only option is to use the ruby filter, as you have already done.
If this is what you want, you can also check my other reply about the same issue.
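For reference, one possible ruby filter along those lines (a sketch, assuming the field is called updated_at and the offset is +8 hours; adjust both to your data):

filter {
  ruby {
    # Shift the UTC value forward 8 hours so the stored field reads as
    # Asia/Shanghai wall-clock time. Note that the field will still carry
    # a "Z" suffix even though it now holds local time.
    code => "event.set('updated_at', LogStash::Timestamp.new(event.get('updated_at').time + 8 * 60 * 60))"
  }
}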
Yes, I just read that link. I used the ruby filter to solve this problem for now. Thanks anyway!
Within the Logstash filter you have specified the timezone of the source data.
If the source data (MySQL) is in local time (Asia/Shanghai), your setup is correct:
timezone=>"Asia/Shanghai"
Now the data is inserted correctly into the Elasticsearch index, and you need to display it.
How do you query the data?
If you are using Kibana, go to
Kibana -> Management -> Advanced Settings -> "Timezone for date formatting" and select Asia/Shanghai.
Then you will see the data in the format you need.
I consume the Elasticsearch data from services, e.g. in Java; if Elasticsearch shows the wrong time, every service reads the wrong data.
Usually it is best to store the data in UTC, so it is easy to convert into different timezones later in the presentation layer (Kibana, or any other service using the data).
But I understand that in some cases this is not viable, especially if you have lots of legacy software using it. As stated before, if you really need to store the data in a specific timezone, you need to use the ruby filter, which you have done.
Cheers!
Thanks very much. I don't want to be entangled with this problem any more; I'll just use Ruby. Thanks again!