Logstash timestamp with CEST


(kalangara) #1

Hi all
I have a timestamp in the log which has the following format

",xxxxxxx,Thu May 31 09:40:46 CEST 2018,yyyyyy...."

I tried the following config
...
...
filter {
  if ([path] =~ /log/) {
    csv {
      columns   => [ "eventId", "eventTime", "eventLevel" ]
      separator => ";"
    }
  }
}

The eventTime field comes through as a string, and I would like it to have the date datatype.

Regards, ginu


(Magnus Bäck) #2

You can use a date filter to parse the string and store it as an ISO8601 timestamp that Elasticsearch will treat as a date. Keep in mind that the existing mapping of the field won't change, so you'll have to reindex existing data or make sure a new index is created.
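On the second point, a minimal sketch of what that could look like, assuming the events are shipped to Elasticsearch from the same pipeline; the host and index name below are placeholders:

output {
  elasticsearch {
    hosts => ["localhost:9200"]        # placeholder host
    # Writing to a new (here: daily) index lets Elasticsearch build a fresh
    # mapping, in which the parsed field can be mapped as a date.
    index => "mylogs-%{+YYYY.MM.dd}"
  }
}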


(kalangara) #3

Thanks Magnus.
Sorry if this is a silly question...
does it mean that I need to do the following?

filter {
  date {
    match => [ "eventTime", "ISO8601" ]
  }
}


(Magnus Bäck) #4

Yes, except not ISO8601 since the date you want to parse isn't in ISO8601 format. See the date filter documentation for more information.
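For reference, a sketch of a pattern that could match the raw string directly; this is only an assumption based on the Joda-style pattern syntax, where text in single quotes is matched as a literal (so the zone abbreviation is consumed but not actually interpreted):

date {
  # Sketch only: 'CEST'/'CET' are matched as literal text, so no offset is
  # applied from them; the timezone option (or the host default) decides that.
  match => [ "eventTime",
             "EEE MMM dd HH:mm:ss 'CEST' yyyy",
             "EEE MMM dd HH:mm:ss 'CET' yyyy" ]
}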


(kalangara) #5
  • I tried

    date {
      match => [ "eventTime", "EEE MMM dd HH:mm:ss yyyy" ]
    }

    but it did not work.

  • I also tried

    mutate {
      gsub => [ "eventTime", "CEST", "" ]
    }
    date {
      match => [ "eventTime", "EEE MMM dd HH:mm:ss yyyy" ]
    }

    and created a new index. It still mapped eventTime as a string, with the value Mon Jun 18 16:59:35 2018, and did not store it as a timestamp.


(Magnus Bäck) #6

If you want the date filter to write the parsed result back to the same field, you need to adjust the target option (by default the parsed result goes to the @timestamp field).


(kalangara) #7

Thanks Magnus, that helped!
With the following config I was able to achieve what I wanted:

mutate {
  gsub => [ "eventTime", "CEST", "" ]
}
date {
  match  => [ "eventTime", "EEE MMM dd HH:mm:ss yyyy" ]
  target => "eventTime"
}

Great!!!
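One follow-up worth noting: with CEST stripped out, the date filter interprets the time in the pipeline's default timezone. A sketch of a variant, assuming the logs really are Central European time (the zone name below is an assumption; adjust it to the actual source):

mutate {
  gsub => [ "eventTime", "CEST", "" ]
}
date {
  match    => [ "eventTime", "EEE MMM dd HH:mm:ss yyyy" ]
  target   => "eventTime"
  timezone => "Europe/Berlin"   # assumption: the source host runs on CET/CEST
}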


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.