Dear forum,
I have a config which is as follows:
filter {
  if [type] == "nginx-default" {
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => [ "message", "%{COMBINEDAPACHELOG}" ]
    }
    date {
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss +0100" ]
      timezone => "CET"
      target => "logdatetime"
    }
  }
}
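(For reference, the parsing that date filter performs can be sketched in plain Ruby; the sample timestamp below is hypothetical, just in the shape grok would extract from an nginx log line:)

```ruby
require 'time'

# Hypothetical timestamp as grok would extract it from an nginx log line.
raw = '10/Feb/2017:14:30:00 +0100'

# Parse it with an explicit offset, mirroring the date filter's pattern.
t = Time.strptime(raw, '%d/%b/%Y:%H:%M:%S %z')

# Logstash stores the parsed value internally as UTC.
puts t.utc.iso8601   # 2017-02-10T13:30:00Z
```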
When I look into the Kibana table, for example, I see:

That's correct, as the date/time is the same as in the original message! But when I look at the JSON, I get a value that differs by exactly one hour:

I believe it's a timezone issue or something similar. Can anyone help me get the correct values in the JSON view as well? I don't want to correct the values in Kibana per view or per field; I want the values corrected by Logstash while they are pushed into Elasticsearch, since I process the data later.
Thanks in advance,
Regards
@timestamp is always UTC. This isn't configurable.
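(A small plain-Ruby sketch of what "always UTC" means here, using a hypothetical event time: the UTC value and the +01:00 value are the same instant, only rendered differently.)

```ruby
require 'time'

# Hypothetical event time in CET (UTC+1).
local = Time.parse('2017-02-10T14:30:00+01:00')

# The same instant expressed in UTC - this is how @timestamp is stored.
utc = local.getutc

puts utc.iso8601   # 2017-02-10T13:30:00Z
puts local == utc  # true - same instant, different representation
```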
Well, OK, but how should I understand these:

@timestamp UTC:

timestamp CET:

logdatetime UTC:

Is there no way to have logdatetime in CET?
You'd have to use a ruby filter. I think examples of that have been posted in the past.
Hey Magnus,
I know there are examples, but I can't get it to work. Can you give a hint as to why this isn't working:
date {
  match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss +0100" ]
  target => "tempdate"
}
ruby {
  code => "event.set('logdatetime', event.get('tempdate').time.strftime('%Y-%m-%dT%H:%M:%S +0100'))"
}


You need to convert the timestamp to the local timezone. I don't know offhand how to do that. Perhaps

  t = event.get('tempdate').time
  t.localtime('+01:00')
  event.set('logdatetime', t.strftime('%Y-%m-%dT%H:%M:%S +0100'))

works.
Works. Made my day, thanks. For anyone else:
date {
  match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss +0100" ]
  target => "tempdate"
}
ruby {
  code => "
    t = event.get('tempdate').time
    t.localtime('+01:00')
    event.set('logdatetime', t.strftime('%Y-%m-%dT%H:%M:%S +0100'))
  "
}
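(The key trick above is Time#localtime, which shifts a time's representation to a given offset in place without changing the instant. A minimal plain-Ruby sketch, using a hypothetical UTC event time:)

```ruby
require 'time'

# Hypothetical parsed event time, as event.get('tempdate').time returns (UTC).
t = Time.parse('2017-02-10T13:30:00Z')

# localtime mutates the receiver, shifting its representation to +01:00.
t.localtime('+01:00')

puts t.strftime('%Y-%m-%dT%H:%M:%S +0100')  # 2017-02-10T14:30:00 +0100
```

Note that the "+0100" in the strftime call is a hard-coded literal, so this only prints the right offset as long as the shift passed to localtime matches it.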