@timestamp not updated using date filter

Hello,
I have the following syslog message that I'm injecting using logstash into elasticsearch:
<189>2019-09-05T16:44:31.766338+02:00 172.x.x.x date=2019-09-05 time=16:44:30

Using Logstash, I'm first putting the syslog timestamp into @timestamp:

date {
  match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601" ]
}
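
(The thread doesn't show how syslog_timestamp gets populated. Assuming a grok filter runs before the date filter above, something along these lines would do it; the pattern and the syslog_pri/syslog_host/syslog_message field names are illustrative only:)

grok {
  # hypothetical pattern for a message like the sample above;
  # only syslog_timestamp is actually needed by the date filter
  match => { "message" => "<%{NONNEGINT:syslog_pri}>%{TIMESTAMP_ISO8601:syslog_timestamp} %{IPORHOST:syslog_host} %{GREEDYDATA:syslog_message}" }
}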

Later, I'm trying to interpret the date/time contained in the payload, which was extracted using kv{}:

 date {
   match => [ "%{date} %{time}", "yyyy-MM-dd HH:mm:ss" ]
   tag_on_failure => ["%_fail_date"]
 }

%_fail_date is not set, so the statement must have been successful.
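
(For reference, the date and time fields above come from the kv{} filter mentioned earlier, whose configuration isn't shown in the thread. A minimal sketch, assuming the kv defaults and that the payload is in the message field:)

kv {
  # with the default field_split (" ") and value_split ("="),
  # "date=2019-09-05 time=16:44:30" yields the fields "date" and "time"
}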

Actual result: (the time retrieved from syslog)
@timestamp Sep 5, 2019 @ 16:44:31.766

Expected result: (the time set later by using "date" again)
@timestamp Sep 5, 2019 @ 16:44:30.000

Thanks.

UPDATE: To be on the safe side I now also set add_tag => ["%_success_date"]. It doesn't appear either :thinking: The code before and after the date filter does run, though (I can tell by the fields that are added and removed).

Not true. tag_on_failure is only applied if the filter attempts (and fails) to parse the source field. If the source field does not exist, the date filter is a no-op. I do not think the date filter accepts a sprintf reference like that; you would need mutate+add_field to create the field, and then a date filter to parse it.

Ok, I forgot not to think in binary. Thanks :slight_smile:

This works:

  mutate {
    # build a temporary field from the kv-extracted date and time;
    # [@metadata] fields are not sent to the output, so it never reaches Elasticsearch
    add_field => { "[@metadata][ts]" => "%{date} %{time}" }
  }
  date {
    # parse the combined value into @timestamp
    match => [ "[@metadata][ts]", "yyyy-MM-dd HH:mm:ss" ]
  }
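
(Not part of the original thread, but for this kind of debugging a stdout output with the rubydebug codec makes it easy to see exactly which fields, including [@metadata], an event ends up with:)

output {
  # prints every event with all of its fields; metadata => true also shows [@metadata]
  stdout { codec => rubydebug { metadata => true } }
}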
