taro_x
February 29, 2020, 8:27am
1
Logs in the following format are now failing to parse:
"timestamp_str": "Feb 29 00:00:00"
filter config:
filter {
  date {
    match => ["timestamp_str", "MMM dd HH:mm:ss", "MMM d HH:mm:ss"]
    timezone => "Asia/Tokyo"
    target => "timestamp"
  }
}
I guess it is due to the leap year, because there was no problem until yesterday. My Logstash process has been running for at least six months.
Looking at the date filter's parser source (package org.logstash.filters.parser):
The time format we have has no year listed, so the year has to be guessed. "Feb 29" should be parsed as "2020-02-29", but "Feb 29 2019" is not a valid date.
parser = DateTimeFormat.forPattern(pattern).withDefaultYear(clock.read().getYear()).withLocale(locale).withZone(DateTimeZone.forID(timezone));
If this initialization is executed only once, at startup, then "Feb 29" might be parsed with a default year of 2019, which is not a leap year. What should I do about this? Would restarting Logstash help?
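For reference, here is a minimal standalone Joda-Time sketch of what I suspect is happening (my own reproduction outside Logstash, assuming Joda-Time on the classpath and an English locale; the class name is made up). It builds the same kind of formatter as the parser line quoted above, once with default year 2019 and once with 2020:

import java.util.Locale;
import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
import org.joda.time.format.DateTimeFormat;
import org.joda.time.format.DateTimeFormatter;

public class Feb29DefaultYearCheck {
    public static void main(String[] args) {
        for (int defaultYear : new int[] {2019, 2020}) {
            // Build a formatter the same way the quoted parser line does,
            // but with an explicit default year instead of clock.read().getYear().
            DateTimeFormatter parser = DateTimeFormat.forPattern("MMM dd HH:mm:ss")
                    .withDefaultYear(defaultYear)
                    .withLocale(Locale.ENGLISH)
                    .withZone(DateTimeZone.forID("Asia/Tokyo"));
            try {
                DateTime parsed = parser.parseDateTime("Feb 29 00:00:00");
                System.out.println(defaultYear + " -> " + parsed);
            } catch (IllegalArgumentException e) {
                // 2019 is not a leap year, so Joda-Time rejects day 29 of February.
                System.out.println(defaultYear + " -> " + e.getMessage());
            }
        }
    }
}

If my guess is right, the 2019 case should throw something like "Value 29 for dayOfMonth must be in the range [1,28]", while the 2020 case parses to 2020-02-29T00:00:00 in Asia/Tokyo.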
system
(system)
Closed
March 28, 2020, 8:27am
2
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.