Parsing timestamp from file and putting it in @timestamp

Hi all.
I use the following scheme to collect logs: filebeat -> logstash -> graylog.
I have a problem with parsing the date and putting it into @timestamp.

Example of a log line:


In Logstash I use this config:

filter {
  if [type] == "idp" {
    grok {
      match => { "message" => "%{MY2_TIMESTAMP_ISO8601:idp_timestamp}Z|%{IDP:method}|%{IDP:hash1}|%{IDP:domain}|%{IDP}|%{IDP}|%{IDP}|%{IDP:hash2}|%{IDP:login}|%{IDP}|%{IDP}|%{IDP}|%{GREEDYDATA:hash3}" }
      overwrite => [ "short_message" ]
    }
    date {
      match => ["idp_timestamp", MY2_TIMESTAMP_ISO8601]
      target => "@timestamp"
    }
  }
}

Patterns file:

IDP [^|]+

So the timestamp from the log is parsed into the idp_timestamp field, but it doesn't get into @timestamp.
Where is my mistake? What am I doing wrong?

The date filter doesn't use grok patterns. This probably works:

date {
  match => ["idp_timestamp", "yyyyMMdd'T'HHmmss'Z'"]
}

Why not use a csv filter to parse each line?
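For example, a csv filter with a pipe separator could replace the whole grok pattern; a minimal sketch, where the column names are taken from the grok fields above and the unnamed columns (col5 etc.) are placeholders:

csv {
  separator => "|"
  columns => ["idp_timestamp", "method", "hash1", "domain",
              "col5", "col6", "col7", "hash2", "login",
              "col10", "col11", "col12", "hash3"]
}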

Thx Magnus!
It works, but in another case I use this:

date {
  match => ["idp_timestamp", "ISO8601"]
  target => "@timestamp"
}

and it also works, but you said that the date filter doesn't use grok patterns.
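(For reference: ISO8601 here is not a grok pattern. It is a special keyword that the date filter itself understands, alongside UNIX, UNIX_MS and TAI64N, so it works without any patterns file:

date {
  match => ["idp_timestamp", "ISO8601"]
}

That is why this case succeeds even though the date filter cannot see MY2_TIMESTAMP_ISO8601.)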

One more question: I parsed a log which uses UTC time; how can I add +3 hours?

The @timestamp field is always UTC.

The date filter's timezone option is useful if the dates being parsed don't include a timezone and their actual timezone is different from the timezone of the machine where Logstash runs.
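For example, if the timestamps in the log were local Moscow time with no zone suffix, you would tell the date filter so, and it would convert them to UTC for @timestamp (a minimal sketch):

date {
  match => ["idp_timestamp", "ISO8601"]
  timezone => "Europe/Moscow"
  target => "@timestamp"
}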

Do I need to add
timezone => "Europe/Moscow"
in the date filter?

Not if the system timezone of the machine where Logstash runs is Europe/Moscow.

Thank you! We have changed the log timestamp timezone in Jetty ))