Below is a log line I'm trying to parse with Logstash:
pvl-api-ft15.sdsdsdasa 2017-05-24 13:45:06,297 INFO http-/0.0.0.0:8080-56 c.s.s.f.s.r.i.RateLimitingRequestFilter RATE_LIMIT | path /AccountServices/V1/ServiceV1_9/balance
My match pattern is %{URIHOST:Server} 20%{DATESTAMP:TimeStamp} %{CISCO_REASON}-%{URIPATHPARAM} %{BASE16FLOAT}%{JAVACLASS}%{BASE16FLOAT}.%{JAVACLASS} RATE_LIMIT \S+ path %{URIPATH:Servicepath}+%{GREEDYDATA:extra_fields}
The issue is that I want @timestamp to be the timestamp from my log file, not the time the event was indexed into Elasticsearch. My date filter is:
date {
  match => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "@timestamp"
}
Your grok filter can't possibly work correctly because the DATESTAMP pattern doesn't match YYYY-MM-DD dates. I suggest you use TIMESTAMP_ISO8601 instead. With your example things happen to work anyway, but that's only because 17 is a valid day and 05 is a valid month.
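For example, only the timestamp part of the pattern would need to change (a sketch; the rest of your pattern stays as-is):

```
# before
%{URIHOST:Server} 20%{DATESTAMP:TimeStamp} ...
# after
%{URIHOST:Server} %{TIMESTAMP_ISO8601:TimeStamp} ...
```

With TIMESTAMP_ISO8601 the full four-digit year is captured, so a yyyy-MM-dd date pattern would then match.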
You're capturing the timestamp in a way that doesn't include the century, so a yyyy-MM-dd date pattern won't work. yy-MM-dd should be okay though.
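Concretely, if you keep the existing 20%{DATESTAMP:TimeStamp} capture, the date filter would look something like this (a sketch of the two-digit-year variant):

```
date {
  match => [ "TimeStamp", "yy-MM-dd HH:mm:ss,SSS" ]
  target => "@timestamp"
}
```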
If that doesn't help, please show an example event, either as reported by a stdout { codec => rubydebug } output or via copy/paste from the JSON tab of Kibana's Discover panel.