Yet another _dateparsefailure

I'm having trouble with a dateparse failure and hoping someone can spot the issue.

Sample log entry:
PSAPPSRV.45377 (947) 2016-01-31T01:00:18.707 GetCertificate Detected time zone is EST

I can grok the data successfully with:
%{DATA:process} \(%{INT:processinstance}\) \[%{TIMESTAMP_ISO8601:appserver_timestamp} %{DATA:action}\]%{GREEDYDATA:message}

The result of appserver_timestamp in the above example is:
2016-01-31T01:00:18.707

I have a date filter set up as follows:

date {
  timezone => "America/New_York"
  match => [ "appserver_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
}

However, I continue to get the "_dateparsefailure" tag on my events in Kibana. Can anyone spot the issue? Also, is there such a thing as a debugger for date parsing, like this one here?

For anyone who comes across this, the fix that worked for me was setting ISO8601 as the match value:

date {
  timezone => "America/New_York"
  match => [ "appserver_timestamp", "ISO8601" ]
}

Nice!

Your example can also be found in the docs.

Guys, I have tried all of that, but nothing helped in my case. I get the _dateparsefailure tag on every event, and the date plugin cannot recognize any format I have tried so far.

Sample log entry:

[Wed May 10 08:09:01.176047 2017] [:error] [pid 43899] [client 192.168.2.250:64749] -----------------------REDIS KEY-------------de05890b8d37a54fd995ddad7a60ed180f0f5d52

I created a custom pattern and added it to my /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns/grok-patterns file:

APACHE_ERROR_TIME %{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}
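As a sanity check outside grok, a rough Python sketch confirms the pattern shape matches the sample timestamp. The sub-pattern regexes below are simplified stand-ins for grok's DAY, MONTH, MONTHDAY, TIME, and YEAR definitions (the real ones in grok-patterns are more permissive):

```python
import re

# Simplified stand-ins for the grok sub-patterns.
day      = r"(?:Mon|Tue|Wed|Thu|Fri|Sat|Sun)"
month    = r"(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)"
monthday = r"\d{1,2}"
time_    = r"\d{2}:\d{2}:\d{2}(?:\.\d+)?"  # grok's TIME allows fractional seconds
year     = r"\d{4}"

apache_error_time = rf"{day} {month} {monthday} {time_} {year}"

sample = "Wed May 10 08:09:01.176047 2017"
print(bool(re.fullmatch(apache_error_time, sample)))
```

Note that the TIME part captures the fractional seconds (".176047") as part of the field, which matters for the date filter pattern later.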

My logstash config:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => '\[%{APACHE_ERROR_TIME:timestamp2}\] \[:%{DATA:messagetype}\] \[pid %{NUMBER:pid}\] \[client %{IPV4:proxyaddr}:%{NUMBER:localport}\] %{GREEDYDATA:pattern}' }
  }

  date {
    match => [ "timestamp2", "EEE MMM dd HH:mm:ss yyyy" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "error_log"
    user => "elastic"
    password => "changeme"
  }
  stdout {
    codec => "rubydebug"
  }
}

Output:

[root@elkstack54 ~]# /usr/share/logstash/bin/logstash --path.config /etc/logstash/error_log.conf --path.settings /etc/logstash/ -l /tmp
Sending Logstash's logs to /tmp which is now configured via log4j2.properties
log4j:WARN No appenders could be found for logger (org.apache.http.client.protocol.RequestAuthCache).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See Apache log4j 1.2 - Frequently Asked Technical Questions for more info.
{
    "offset" => 1401,
    "messagetype" => "error",
    "timestamp2" => "Wed May 10 08:09:01.176047 2017",
    "input_type" => "log",
    "pattern" => "-----------------------REDIS KEY-------------de05890b8d37a54fd995ddad7a60ed180f0f5d52",
    "pid" => "43899",
    "source" => "/logovi/logovi2/example.com-error_log",
    "message" => "[Wed May 10 08:09:01.176047 2017] [:error] [pid 43899] [client 192.168.2.250:64749] -----------------------REDIS KEY-------------de05890b8d37a54fd995ddad7a60ed180f0f5d52",
    "type" => "errorlog",
    "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_dateparsefailure"
    ],
    "@timestamp" => 2017-05-16T16:51:11.521Z,
    "@version" => "1",
    "beat" => {
        "hostname" => "elkstack54.example.local",
        "name" => "elkstack54.example.local",
        "version" => "5.4.0"
    },
    "host" => "elkstack54.example.local",
    "proxyaddr" => "192.168.2.250",
    "localport" => "64749"
}

I don't know where the mistake is...

Please help, I would very much appreciate it.

Thank you

Same problem, although I used several different date formats.

I tried

date {
  match => [ "timestamp2", "dd/MMM/YYYY:HH:mm:ss" ]
}

,

date {
  match => [ "timestamp2", "ISO8601" ]
}

and

date {
  match => [ "timestamp2", "yyyy-MM-dd HH:mm:ss.SSS" ]
}

always the same result: _dateparsefailure.

The timestamp2 field in the output always keeps the same format, for example:

"timestamp2" => "Wed May 10 08:09:01.179802 2017"

whatever I specify in the date plugin.

Can somebody help if you know why this is happening?

Thank you

Any solution?

I have no idea what to do

Please help if anyone knows why this is happening.

Thank you in advance

You are not including the fractional seconds in your pattern.

date {
  match => [ "timestamp2", "EEE MMM dd HH:mm:ss.SSSSSS yyyy" ]
}
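The field value "Wed May 10 08:09:01.176047 2017" carries six fractional-second digits, so a pattern that stops at `ss` leaves ".176047" unconsumed and the parse fails. A quick sketch of the same idea using Python's strptime directives as stand-ins for the Joda-Time tokens:

```python
from datetime import datetime

ts = "Wed May 10 08:09:01.176047 2017"

# Analogue of "EEE MMM dd HH:mm:ss yyyy" -- the fractional
# seconds are not accounted for, so parsing fails.
try:
    datetime.strptime(ts, "%a %b %d %H:%M:%S %Y")
    print("matched")
except ValueError:
    print("failed: fractional seconds not consumed")

# Analogue of "EEE MMM dd HH:mm:ss.SSSSSS yyyy" -- including
# the fraction makes the whole value parse.
print(datetime.strptime(ts, "%a %b %d %H:%M:%S.%f %Y"))
```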

It seems that I have no working grok parsing scheme. I think I have to open a new topic for this problem. Sorry for bumping this old topic.

@guyboertje Thank you for showing me where the problem is