Create a new date-type field from a csv with Logstash

Hi all,

Thanks in advance for your help here
Just trying to figure out what I am doing wrong.
I've got a field from a csv file called 'time' that looks like '07:00:01.000', and I'm trying to create an 'eventdate' field as below (I want to use it as the time filter for my Kibana index, so it has to be recognized automatically as a date type by Elasticsearch):

mutate { add_field => { "eventdate" => "%{+YYYY.MM.dd} %{time}" } }
date {
  locale => "en"
  match => ["eventdate", "YYYY.MM.dd HH:mm:ss.SSS"]
  target => "eventdate"
}

This worked well, but eventdate gets shifted to my local server time zone (so if time = 7am, eventdate ends up as 8am ...).

My Logstash stdout{} output looks good, with eventdate at 7am, but once it's indexed and shown in Kibana it gains one hour ...
I would prefer to keep my Kibana settings as they are right now and just import the date exactly as it comes out of Logstash ...
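One thing I'm considering on the Logstash side (not sure if it's the right approach) is telling the date filter explicitly which time zone the csv times are in, so the conversion to UTC doesn't depend on the server's locale. A sketch, where "Europe/Paris" is just a placeholder for whatever zone my csv times actually use:

date {
  locale   => "en"
  match    => ["eventdate", "YYYY.MM.dd HH:mm:ss.SSS"]
  # placeholder zone: replace with the zone the csv 'time' values are actually in
  timezone => "Europe/Paris"
  target   => "eventdate"
}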

Thanks all ! :slight_smile:

Check the settings of your Kibana space, mainly the time zone: by default Kibana uses your browser's time zone.

Hi @ylasri
Thanks for answering
Found it and it's fixed now.
