Best way to truncate a date field?

I'm ingesting old logs and want to use a date field value in the index name:

elasticsearch{
    index => "logstash-systemname-%{day}"
    ...
}

It's a LogStash::Timestamp field, but I want to copy it and truncate the copy to day precision.

So which is better?

#1

ruby{
    code => "
        event['day'] = event['log_timestamp'];
        event['day'] = event.sprintf('%{+YYYY-MM-dd}');
    "
}

[note that #1 has failed for me on occasion, not sure why]

#2

ruby{
    code => "
        require 'time';
        event['day'] = Time.parse(event['log_timestamp'].to_s).strftime('%Y-%m-%d');
    "
}

Or is there a better, third option?

Beware that in your #1 case, event.sprintf('%{+YYYY-MM-dd}') formats the @timestamp field, not your log_timestamp.
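Outside of Logstash, the truncation your #2 does is just parse-then-strftime in plain Ruby. A standalone sketch, with a hypothetical timestamp value standing in for event['log_timestamp']:

```ruby
require 'time'

# Hypothetical raw value standing in for event['log_timestamp'].
log_timestamp = '2014-03-07 15:42:10 UTC'

# Same truncation as option #2: parse the full timestamp, keep only the day.
day = Time.parse(log_timestamp).strftime('%Y-%m-%d')
puts day  # => 2014-03-07
```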

Is changing the @timestamp value out of the question? Is that why you are not using the date filter?

filter {
  date {
    #please use the correct pattern(s)
    match => ['log_timestamp', 'yyyy-MM-dd HH:mm:ss']
    add_field => { 'day' => '%{+YYYY-MM-dd}' }
  }
}
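Putting it together, a minimal filter-plus-output sketch (field names and index pattern taken from the question; adjust the match pattern to your actual log format):

```
filter {
  date {
    match => ['log_timestamp', 'yyyy-MM-dd HH:mm:ss']
    add_field => { 'day' => '%{+YYYY-MM-dd}' }
  }
}
output {
  elasticsearch {
    index => "logstash-systemname-%{day}"
  }
}
```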

That explains the odd behaviour I was seeing sometimes, thanks.

I am actually using the date filter to "reset" the @timestamp for the event to the datetime in the event's log_timestamp field. I just omitted that code.

But I want only the day portion in the index name. Not sure how I would use the date filter to do that.

I really should finish reading the post before replying :frowning: I just saw your date filter snippet and I'll give it a try, thanks.

Thanks @wiibaa, that worked and is much cleaner than ruby code.