Date field in Logstash

Hi

I use Logstash to send my IIS log files to Elasticsearch. I use my own grok filter:
"message", "%{TIMESTAMP_ISO8601:request_timestamp} %{IP:server_ip} %{WORD:request_method} %{URIPATH:request_uri} %{NOTSPACE:request_query} %{NUMBER:port} %{NOTSPACE:username} %{IP:client_ip} %{NOTSPACE:request_agent} %{NOTSPACE:request_referer} %{INT:response_status} %{INT:response_substatus} %{INT:response_winstatus} %{INT:response_timetaken}"

Then I run a DSL query against Elasticsearch to get the logs whose request_timestamp falls in a specified date range, but the results include logs from outside that range.
However, if I apply the same range to the @timestamp field instead, the results seem fine, so in my opinion the problem is the type of the request_timestamp field.
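
For reference, the range query looks roughly like this (the index name and date values here are placeholders):

GET /logstash-*/_search
{
  "query": {
    "range": {
      "request_timestamp": {
        "gte": "2016-05-01 00:00:00",
        "lte": "2016-05-31 23:59:59"
      }
    }
  }
}

When request_timestamp is mapped as text rather than date, the range bounds are compared against indexed terms instead of parsed dates, which would explain hits from outside the intended range.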
I have even defined an index template where my field definition is:
"request_timestamp" : {
"type" : "date",
"format" : "yyyy-MM-dd HH:mm:ss"
}
but it still does not work (the default Logstash mapping for the field was text).
How can I solve this? Or even better, how can I use the date from my logs as the @timestamp field? The default @timestamp is completely useless in my specific case.
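
For context, that field definition sits inside an index template roughly like the sketch below (this uses the composable index template API available in Elasticsearch 7.8+; the exact syntax differs on older versions, and the template name and index pattern are placeholders). Note that templates are only applied when an index is created, so an index that already exists keeps its old text mapping until it is recreated or reindexed.

PUT _index_template/iis-logs
{
  "index_patterns": ["logstash-*"],
  "template": {
    "mappings": {
      "properties": {
        "request_timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}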

@waitangi

Have you verified whether the timezone of your query matches the timezone of your request_timestamp values?

Got it. I added timezone => "UTC" to the date filter definition in my configuration file, then added remove_field => [ "request_timestamp" ], and now @timestamp holds the date parsed from the log file and there is no separate request_timestamp field. That suits me fine.
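
For reference, the relevant part of the filter section now looks roughly like this (simplified; the match pattern is the same format string as in the index template above):

filter {
  date {
    # Parse the timestamp captured by grok into @timestamp.
    match => [ "request_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "UTC"
    # Drop the original field once @timestamp has been set.
    remove_field => [ "request_timestamp" ]
  }
}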
Thanks davemoore.
