Change timestamp to be timestamp from logfile

Hi,

Below is one log line I'm trying to parse via Logstash:

pvl-api-ft15.sdsdsdasa 2017-05-24 13:45:06,297 INFO http-/0.0.0.0:8080-56 c.s.s.f.s.r.i.RateLimitingRequestFilter RATE_LIMIT | path /AccountServices/V1/ServiceV1_9/balance

My match pattern is %{URIHOST:Server} 20%{DATESTAMP:TimeStamp} %{CISCO_REASON}-%{URIPATHPARAM} %{BASE16FLOAT}%{JAVACLASS}%{BASE16FLOAT}.%{JAVACLASS} RATE_LIMIT \S+ path %{URIPATH:Servicepath}+%{GREEDYDATA:extra_fields}

The issue is that I want the timestamp to be the timestamp from my logfile, and not the time the event got ingested into Elasticsearch.

date {
  match  => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "@timestamp"
}

I tried the above and it didn't work. ;-(

There are multiple problems:

  • Your grok filter can't work correctly, because the DATESTAMP pattern doesn't match YYYY-MM-DD dates. I suggest you use TIMESTAMP_ISO8601 instead (see the sketch after this list). With your example things happen to match anyway, but only because, once the literal 20 is stripped off, 17 happens to be a valid day, 05 a valid month, and 24 a valid two-digit year.
  • You're capturing the timestamp in a way that doesn't include the century, so a yyyy-MM-dd date pattern won't work. yy-MM-dd should be okay though.
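
Something along these lines would be a starting point (just a sketch: I've collapsed everything after the timestamp into a GREEDYDATA field I called "rest", so the rest of your existing pattern would go there instead):

# capture the full "2017-05-24 13:45:06,297" timestamp in one go
grok {
  match => { "message" => "%{URIHOST:Server} %{TIMESTAMP_ISO8601:TimeStamp} %{GREEDYDATA:rest}" }
}
# the date filter can handle the comma before the milliseconds directly
date {
  match  => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "@timestamp"
}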

If that doesn't help, please show an example event, either as reported by a stdout { codec => rubydebug } output or via copy/paste from the JSON tab of Kibana's Discover panel.
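
That is, something like this temporary output section while you're debugging:

output {
  stdout { codec => rubydebug }
}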

Thanks for your reply! I got it working by doing the below:
mutate {
  gsub => [
    # replace all commas with dots
    "TimeStamp", ",", "."
  ]
}
mutate {
  gsub => [
    # replace all spaces with semicolons
    "TimeStamp", " ", ";"
  ]
}
date {
  locale   => "en"
  match    => [ "TimeStamp", "YYYY-MM-dd;HH:mm:ss.SSS" ]
  timezone => "Europe/London"
  target   => "logTimestamp"
}
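
For the record, the gsub steps shouldn't be strictly necessary, since the date filter can match the comma and the space literally. A shorter sketch of the same idea (assuming TimeStamp now carries the full four-digit year, e.g. captured with TIMESTAMP_ISO8601 as suggested above):

date {
  locale   => "en"
  match    => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  timezone => "Europe/London"
  target   => "logTimestamp"
}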
