I'm a developer in China and our timezone is +08:00. The problem I have with Logstash is that @timestamp is always stored in UTC, e.g. "@timestamp" => "2015-07-25T16:00:30.000Z" when the input time is 2015-07-26 00:00:30. This causes one day's logs to be split across two indexes: logstash-2015.07.25 and logstash-2015.07.26.
I tried to work around it by adding a logged_date field representing 2015-07-26 in the +08:00 timezone, but Kibana adds 08:00 hours to every date field, so logged_date ends up wrong for the Chinese timezone.
Could anyone suggest a solution to this problem? I googled around and found no proper one.
I've read the user guide for the Logstash date filter: its timezone parameter is used to parse the input log time, not to change the output, so it cannot be used to change the timezone of the output @timestamp.
Yes, Elasticsearch and Logstash using UTC is fine for Kibana. But the Logstash elasticsearch output plugin uses @timestamp to format the index name, so one day's logs in the +08:00 timezone end up in two different indexes, which is confusing for us in +08:00, even though Kibana corrects for this when displaying query results.
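One workaround (my own minimal sketch, not from the posters above; it assumes the Logstash 5.x event.get/event.set Ruby API, a hard-coded +08:00 offset, and placeholder hosts) is to compute the local date in a ruby filter and use that field for the index name, so the index boundary follows local midnight while @timestamp stays in UTC:

filter {
  ruby {
    # Shift the UTC @timestamp by +08:00 and keep only the date part.
    # Stored under @metadata so it is not indexed with the document.
    code => "event.set('[@metadata][local_date]', (event.get('@timestamp').time + 8 * 3600).strftime('%Y.%m.%d'))"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{[@metadata][local_date]}"
  }
}

With this, an event logged at 2015-07-26 00:00:30 +08:00 (@timestamp 2015-07-25T16:00:30.000Z) would land in logstash-2015.07.26.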
Hi, I'd like to know your solution for this; we have the same problem. As we all know, Logstash uses UTC and Kibana automatically converts to the user's browser-local timezone, which is fine.
But we also want to use Filebeat and Logstash without parsing and sending the data to Elasticsearch; we just want to keep the log files, split them by the server's local time, and parse them with our own scripts.
So, is there any way to configure Logstash to generate log files in server-local time instead of UTC?
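The same ruby-filter trick can drive the file output instead of the elasticsearch output (again only a sketch; the path, the +08:00 offset, and the field name are my assumptions), so that files roll over at local midnight rather than UTC midnight:

filter {
  ruby {
    # Convert the UTC @timestamp to the server-local (+08:00) date for the file name.
    code => "event.set('[@metadata][local_day]', (event.get('@timestamp').time + 8 * 3600).strftime('%Y-%m-%d'))"
  }
}
output {
  file {
    path => "/var/log/archive/app-%{[@metadata][local_day]}.log"
    codec => line { format => "%{message}" }
  }
}

The raw message is written unchanged; only the file an event lands in is decided by the local date.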
My myindex_last_run.txt contains:
--- 2017-01-23 18:30:00.297000000 Z
Before I set jdbc_default_timezone => "UTC", my query followed the value in myindex_last_run.txt, which is stored in UTC, so the log said:
select * from sales where sales_date > '2017-01-23 18:30:00' <== Wrong
After I set jdbc_default_timezone => "UTC", my query changed to:
select * from sales where sales_date > '2017-01-24 01:30:00' <== right value
I had been frustrated for several weeks and simply deleted the documents duplicated by the query every time I restarted Logstash.
Now I can sleep well with this configuration.
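For reference, a minimal jdbc input sketch showing where jdbc_default_timezone sits relative to the tracked :sql_last_value (connection string, driver paths, schedule, and file paths are placeholders, not from the poster):

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # Controls how the plugin converts database timestamps; the poster reports "UTC" works for their setup.
    jdbc_default_timezone => "UTC"
    statement => "SELECT * FROM sales WHERE sales_date > :sql_last_value"
    last_run_metadata_path => "/path/to/myindex_last_run.txt"
    schedule => "* * * * *"
  }
}

The right value generally depends on the timezone your database column is stored in, so treat "UTC" here as the poster's report rather than a universal setting.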
Kibana converts the UTC-based data to the client browser's timezone, which works fine when using the ES + Kibana + Logstash stack.
What I am asking is how to change Logstash's default timezone to Beijing time and then use the file plugin to write the data to disk for storage only, instead of sending it to ES.