Date Filter

I can't seem to get this date filter to work:

"logstash.version"=>"6.1.0"
Elasticsearch 5.6.3
filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns"]
    match => [ "message", "%{LOGTIMESTAMP:logTimestamp}" ]
  }
  date {
    locale => "en"
    match => ["logTimestamp", "MMM D HH:mm:ss"]
    target => "logTimestamp"
  }
}

The log entry looks like this:
{
  "_index": "logstash-2018.02.02",
  "_type": "doc",
  "_id": "AWFT0dubiPn6kkLXSbjY",
  "_score": 1,
  "_source": {
    "logTimestamp": "Feb 1 23:52:34",
    "@version": "1",
    "message": "{"@timestamp":"2018-02-01T23:52:34.763Z","@metadata":{"beat":"filebeat","type":"doc","version":"6.1.0","topic":"Capsule_logs"},"source":"/data_0/logs/company/sandbox-dal-9-data.company.com/postgresql343/postgresql343.log","offset":12112324,"message":"Feb 1 23:52:34 bluemix-sandbox-dal-9-data.company.com postgresql343: haproxy_status.23 | 2018/02/01 23:52:34 hastatus response time 12.295422ms; cmd time 12.26587ms; response code 503","tags":["postgresql"],"prospector":{"type":"log"},"beat":{"name":"syslog.internal","hostname":"syslog.internal","version":"6.1.0"}}",
    "@timestamp": "2018-02-02T00:01:57.337Z",
    "tags": [
      "_dateparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      1517529717337
    ]
  }
}

I want to use the timestamp found in the log message itself. Why do I keep getting _dateparsefailure? The grok pattern works, and logTimestamp shows up in Kibana as a string.

Thanks!

Try "en-US" or "en_US" for locale.

That causes the date filter to ignore the logTimestamp and match the @timestamp attribute.

Are you using a JDBC input plugin? Could this be a related issue?

When the date filter can't parse a string it'll log clues about what it's having problems with.

match => ["logTimestamp", "MMM D HH:mm:ss"]

"D" is day-of-year. Use "d" instead. I think you'll also need to specify a second pattern with "dd" instead of "d".

Hi. I've tried this as well but am still having the issue where the match string now matches the Time attribute rather than the log timestamp. My current config looks like so:
input {
  kafka {
    bootstrap_servers => "kafka02.company.net:9093"
    topics => ["Capsule_logs"]
  }
}

filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns"]
    match => [ "message", "%{LOGTIMESTAMP:logTimestamp}" ]
  }

  date {
    timezone => "UTC"
    match => ["logTimestamp", "MMM  d HH:mm:ss"]
    target => "logTimestamp"
  }
}

output {
  elasticsearch {
    hosts => ["https://user:password.deployment-logs.company.com:17825/"]
    ssl => true
    ssl_certificate_verification => true
  }
}

I tried the locale parameter, which caused a _dateparsefailure. I realize that once this is working I'll have to add an additional match pattern for "dd".

Thanks!

Hey. That's a good thought. I've updated the GH issue.

I've tried this as well but am still having the issue where the match string now matches the Time attribute rather than the log timestamp.

What do you mean? Please show examples instead of describing the results.

filter:
filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns"]
    match => [ "message", "%{LOGTIMESTAMP:logTimestamp}" ]
  }
  date {
    match => ["logTimestamp", "MMM dd HH:mm:ss"]
    target => "logTimestamp"
  }
}

Pattern:

LOGTIMESTAMP %{MONTH} +%{MONTHDAY} %{TIME}
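For reference, each grok token in that pattern corresponds to a Joda format token on the date-filter side (my own annotation, not part of the original pattern file):

```
# LOGTIMESTAMP %{MONTH} +%{MONTHDAY} %{TIME}
#   %{MONTH}    -> MMM       (abbreviated month name, e.g. "Feb")
#   %{MONTHDAY} -> d / dd    (day-of-month; the " +" absorbs syslog's
#                             extra padding space before single-digit days)
#   %{TIME}     -> HH:mm:ss
```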

Log sample:
{"@timestamp":"2018-02-12T20:53:59.319Z","@metadata":{"beat":"filebeat","type":"doc","version":"6.1.0","topic":"Capsule_logs"},"message":"Feb 12 16:46:59 server.company.com mongodb315: mongos.24 | 2018-02-12T16:46:59.472+0000 I SHARDING [Balancer] distributed lock with ts: 5a81c503a6805c4336799218' unlocked.","tags":["mongodb"],"prospector":{"type":"log"},"beat":{"name":"syslog.internal","hostname":"syslog.internal","version":"6.1.0"},"source":"/data_0/logs/compose/server.company.com/mongodb315/mongodb315.log","offset":1337412}

logTimestamp is matched:

"logTimestamp": "Feb 12 16:46:59"

But it still gets tagged with _dateparsefailure. I don't understand why the date filter fails even though the pattern matches.

If the date filter fails it'll log a message that points to the error.

With Logstash logging set to debug, what I see in the logs is:
[2018-02-13T00:16:11,475][DEBUG][logstash.pipeline ] output received {"event"=>{"tags"=>["_dateparsefailure"], "message"=>"{"@timestamp":"2018-02-13T00:16:07.784Z","@metadata":{"beat":"filebeat","type":"doc","version":"6.1.0","topic":"Capsule_logs"},"source":"/data_0/logs/company.com/postgresql463/postgresql463.log","offset":8754897,"message":"Feb 12 23:25:22 company.com postgresql463: postgres.24 | Updating the TTL for primary.","tags":["postgresql"],"prospector":{"type":"log"},"beat":{"name":"syslog.internal","hostname":"syslog.internal","version":"6.1.0"}}", "@version"=>"1", "@timestamp"=>2018-02-13T00:16:10.933Z, "logTimestamp"=>"Feb 12 23:25:22"}}

The logTimestamp is being set but the event is being tagged with a dateparsefailure. I don't see anything specifically noting why this is happening, just that it is happening.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.