A question about Logstash's YYYY.MM.dd index naming

I am not sure what Logstash bases YYYY.MM.dd on.

I'm in the Asia/Seoul timezone. I sent a message at 1:00 a.m. local time on the 14th, but there was no newly created index for the 14th in Elasticsearch.

I searched other similar questions and saw that the index date is based on @timestamp.

The @timestamp clearly showed the 14th, but there was no new index for the 14th.

Am I doing something wrong? If I change @timestamp to a future time using Logstash's date filter, will the YYYY.MM.dd in the index name move into the future as well?

It's UTC, always UTC.

If you don't have a date filter in your Logstash config to make sure @timestamp is correctly calculated, that may be what you are seeing.
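A minimal sketch of such a date filter, assuming the original event carries its local time in a field called `logdate` in the format shown (both the field name and format are assumptions for illustration):

```conf
filter {
  date {
    # parse the local time from the event...
    match    => ["logdate", "yyyy-MM-dd HH:mm:ss"]
    # ...interpreting it as Asia/Seoul time
    timezone => "Asia/Seoul"
  }
}
```

This sets @timestamp to the correct instant, but note that @timestamp itself is still stored in UTC, and the index name is still formatted from the UTC value.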

Does that mean the value of @timestamp has no effect on YYYY.MM.dd?

It does, but again it all defaults to UTC.

Thank you for the answer.

If so, can Logstash replace the UTC value it uses with the @timestamp value I changed?

Logstash will always treat that as if it's UTC.

Is there any way to change it to something other than UTC?

@timestamp is always stored and printed as UTC. If you want to create a string with the local time, you'll have to use Ruby.

filter {
  ruby {
    code => "event.set('[@metadata][localtime]', event.get('@timestamp').time.localtime.strftime('%Y.%m.%d %H:%M:%S'))"
  }
}
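If the goal is an index name based on local time, a hedged variant of the above stores just the local date in metadata and references it in the output (the `hosts` value and the `logs-` prefix are assumptions for illustration; index names must be lowercase):

```conf
filter {
  ruby {
    # store only the local date, suitable for an index name
    code => "event.set('[@metadata][localdate]', event.get('@timestamp').time.localtime.strftime('%Y.%m.%d'))"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]               # assumed host
    index => "logs-%{[@metadata][localdate]}" # assumed prefix
  }
}
```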

I use the date filter to change the value of @timestamp in my messages (logstash1).

Then I output it to Kafka, and a second Logstash instance (logstash2) consumes it and indexes into Elasticsearch.

Then, in logstash2, I set
index => "%{+YYYY.MM.dd}"
Is the index name determined by the @timestamp value I changed in logstash1?

I think this is the problem I have.

You are really working against the way the system is designed.

Elasticsearch uses UTC, so the indices that are created are done so using UTC, and the data is then stored as UTC (with a timezone).
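The timezone arithmetic behind the missing index can be illustrated in plain Ruby (the dates here are made up for the example):

```ruby
# A message sent at 1:00 a.m. on the 14th in Asia/Seoul (UTC+9)...
local = Time.new(2019, 8, 14, 1, 0, 0, "+09:00")

# ...is still the 13th in UTC, which is the value %{+YYYY.MM.dd}
# formats when building the index name.
puts local.utc.strftime('%Y.%m.%d')  # → 2019.08.13
```

So until 9:00 a.m. local time, events still land in the previous day's index; that matches the behavior described above.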

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.