Hi, I'm trying to import a CSV file into Logstash, but I have a problem: the first column of the CSV file is the date and time.
I parse it into a field called "timestamp", but 8 hours get added (my time zone is +8).
How can I set the time so it doesn't add +8?
This happens because Elasticsearch stores every date field in UTC, and Kibana converts that UTC time to your local time. So if your date is not in UTC, you need to tell Elasticsearch that, to avoid this confusion.
In your case, your date does not carry any timezone information, so Elasticsearch will assume that 2022/10/17 19:17:53 is in UTC, and when Kibana converts that time to your +8 time zone, you will see 2022/10/18 03:17:53.000.
You need to use a date filter in your Logstash pipeline to tell Elasticsearch that your field is not in UTC:
date {
  match => ["timestamp", "yyyy/MM/dd HH:mm:ss"]
  timezone => "+0800"
  target => "timestamp"
}
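For context, a minimal sketch of a complete pipeline might look like the following (the file path, column names, separator, and output index are assumptions, adjust them to your CSV):

input {
  file {
    path => "/path/to/your.csv"        # hypothetical path to the CSV file
    start_position => "beginning"
    sincedb_path => "/dev/null"        # re-read the file on every run, handy for testing
  }
}

filter {
  csv {
    separator => ","
    columns => ["timestamp", "value"]  # hypothetical column names, first column is the date/time
  }
  date {
    match => ["timestamp", "yyyy/MM/dd HH:mm:ss"]
    timezone => "+0800"                # tell Logstash the source time is UTC+8
    target => "timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"] # assumed local Elasticsearch
    index => "csv-import"              # hypothetical index name
  }
}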
You can use either a numeric offset or a canonical timezone name. The difference is that with a numeric offset you can see the time difference just by looking at the config, whereas with a canonical timezone name you may need to look it up elsewhere, unless it's a timezone you already know.
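For example, for UTC+8 these two settings are equivalent (Asia/Shanghai is just one canonical name with that offset; pick the one that matches your location):

  timezone => "+0800"          # numeric offset, fixed all year
  timezone => "Asia/Shanghai"  # canonical name, same +08:00 offset (this zone has no DST)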
Yes, some countries use DST and others don't. Berlin is +1h, plus +1h for DST => +2h right now.
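As a sketch, if the timestamps were written in Berlin local time, a canonical zone name would track the DST switch automatically, whereas a fixed offset would be an hour off for part of the year (field name and format assumed from the example above):

  date {
    match => ["timestamp", "yyyy/MM/dd HH:mm:ss"]
    timezone => "Europe/Berlin"  # +01:00 in winter, +02:00 during DST, handled automatically
    target => "timestamp"
  }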
I may not be right; somewhere out there is a better developer.
One last thing: DST in Europe ends on 30 October, but in the United States, Canada, and Mexico's northern border cities, Daylight Saving Time (DST) ends on Sunday, November 6, 2022.
Time zones are excellent at making a mess in a cloud environment, especially for banks' logs or revenue data.