Date filter not parsing

Hi all,

I've been trying to parse my date, the logfile is in the following format:
15 Aug 2016 00:00:03,821|INFO |snjcprddex1|

This is my conf file:
input {
  file {
    path => "I:/LOGFILES/AthenaScenario/."
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  grok {
    patterns_dir => ["I:/LOGFILES/CONF/patterns"]
    match => ["message", "%{TIMERECORD:timerecord}|"]
  }

  date {
    match => ["timerecord", "dd MMM YYYY HH:mm:ss,SSS"]
    target => "timerecord"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "athena"
    workers => 1
  }
  stdout { codec => rubydebug }
}
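(Editorial aside: the date pattern itself does fit the timestamp in the log line. A quick sanity check, assuming the Joda-Time format "dd MMM YYYY HH:mm:ss,SSS" maps to the strptime format "%d %b %Y %H:%M:%S,%f" here, with %f right-padding the three millisecond digits to microseconds:)

```python
from datetime import datetime

# strptime equivalent of the date filter's "dd MMM YYYY HH:mm:ss,SSS"
fmt = "%d %b %Y %H:%M:%S,%f"

# Timestamp taken from the sample log line above
ts = datetime.strptime("15 Aug 2016 00:00:03,821", fmt)
print(ts.isoformat())  # 2016-08-15T00:00:03.821000
```

So the date filter's match pattern is not the problem; the timerecord field is simply never being extracted by grok.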

And my pattern looks like this:

Logstash is somehow not parsing the date, and not showing any error message either.
Please advise!!

What do the resulting events look like, i.e. what is the output of your stdout output?

It shows:

"message" => "13 Aug 2016 23:52:46,982|INFO |snjcprddex30",
"@version" => "1",
"@timestamp" => "2016-08-23T03:33:41.542Z",
"path" => "I:/LOGFILES/CAPITAL.PROD.7024.snjcprddex30.l
"host" => "v10153257"

Your grok expression is wrong. You don't seem to be escaping the | so you're either matching against TIMERECORD or an empty string. In this case I expected TIMERECORD to match anyway but for some reason it doesn't. Change your grok expression to %{TIMERECORD:timerecord} \|.
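(Editorial aside: grok patterns compile down to regular expressions, where an unescaped | is alternation. The actual TIMERECORD pattern isn't shown in the thread, so the regex below is a stand-in for whatever it expands to, but it illustrates the point:)

```python
import re

# Stand-in for whatever %{TIMERECORD} expands to (not shown in the thread)
TIMERECORD = r"\d{2} \w{3} \d{4} \d{2}:\d{2}:\d{2},\d{3}"

line = "15 Aug 2016 00:00:03,821|INFO |snjcprddex1|"

# Unescaped, '|' is alternation: "TIMERECORD OR the empty string",
# so the pattern can "succeed" by matching nothing at all.
m = re.search(TIMERECORD + "|", "a line with no timestamp")
print(repr(m.group(0)))  # '' -- an empty match

# Escaped, '\|' is a literal pipe that must follow the timestamp.
m2 = re.search(TIMERECORD + r"\|", line)
print(repr(m2.group(0)))  # '15 Aug 2016 00:00:03,821|'
```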

Thanks. However, now I get a _grokparsefailure.

  "tags" => [
    [0] "_grokparsefailure"
  ]


Oops. Make the grok expression ^%{TIMERECORD:timerecord}\|. Leading ^ and no space before \|. You should also consider using the csv filter for this kind of data.
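(Editorial aside: a minimal sketch of what that csv filter could look like for this pipe-delimited format. The column names here are assumptions, and the date filter is unchanged from the original config:)

```
filter {
  csv {
    separator => "|"
    columns => ["timerecord", "loglevel", "hostname"]
  }
  date {
    match => ["timerecord", "dd MMM YYYY HH:mm:ss,SSS"]
    target => "timerecord"
  }
}
```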

Thanks! Although that grok expression still didn't work, I'm using a simple csv filter now and I'm able to get what I want. Thanks again!