Simple Grok and Date filter problem

Hello, I recently started using Elasticsearch and Logstash, and I'm currently trying to extract a few fields from this log line:

2018-12-05 00:13:45,000 LogType - MachineNumber Message

I use this filter:

filter {
  grok {
    match => { "message" => "%{DATESTAMP:LogDate} %{WORD:LogType} - %{WORD:MachineNumber} %{GREEDYDATA:Message}" }
  }
  date {
    match => [ "LogDate", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "LogDate"
  }
}

I'm able to get every field I want, but I want the date to be stored as a date in LogDate. In Kibana, however, LogDate shows up as a string (I guess the conversion to a date failed), and its value is "0018-12-05T00:13:45,000Z".

Do you see anything wrong with my filter? I must have misunderstood something about the date filter. It may be a problem with the date format as well, but it looks OK to me.
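In case it helps to reproduce, here is the complete minimal pipeline I'm testing with (the stdin/stdout plugins are only for debugging, my real input and output are different), so the sample line can be pasted straight into the console:

input { stdin {} }

filter {
  grok {
    match => { "message" => "%{DATESTAMP:LogDate} %{WORD:LogType} - %{WORD:MachineNumber} %{GREEDYDATA:Message}" }
  }
  date {
    match => [ "LogDate", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "LogDate"
  }
}

output { stdout { codec => rubydebug } }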

It could be that you have already ingested LogDate as a string into the current index; once a field has been mapped as a string, that mapping cannot be changed for the existing index. Can you start over with a new index? The filter itself looks fine.
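For example (just a sketch, assuming you index via Logstash's elasticsearch output; the hosts value and index name here are made up), pointing the output at a fresh index lets Elasticsearch map LogDate as a date the first time it is written:

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # made-up host, use your own
    index => "mylogs-v2"          # hypothetical new index, gets a fresh mapping
  }
}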

Thank you @Badger, my LogDate is now a date!
I still have one problem: the field is filled with 0018-12-12 instead of the original 2018-12-12. Do you know why my grok filter drops the very first characters?

DATESTAMP matches a two-digit year, so against 2018-12-05 it starts matching at the second character and only captures 18-12-05, which the date filter then reads as the year 0018. If you try to anchor the pattern to the beginning of the line

"^%{DATESTAMP:LogDate} %{WORD:LogType} - %{WORD:MachineNumber} %{GREEDYDATA:Message}"

then you get a _grokparsefailure. I would use dissect instead of grok.

dissect {
  mapping => { "message" => "%{LogDate} %{+LogDate} %{LogType} - %{MachineNumber} %{Message}" }
}
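Here %{+LogDate} appends the time to the date field, reusing the space delimiter, so LogDate keeps its four-digit year. Your original date filter should then parse it as-is; a sketch of the combined filter (same pattern string as above, untested against your real logs):

filter {
  dissect {
    mapping => { "message" => "%{LogDate} %{+LogDate} %{LogType} - %{MachineNumber} %{Message}" }
  }
  date {
    match => [ "LogDate", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "LogDate"
  }
}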

OK, I tried your dissect and it works perfectly!
Thank you so much!
