Hi
I use Logstash to push my IIS log files into Elasticsearch, with my own grok filter:
"message", "%{TIMESTAMP_ISO8601:request_timestamp} %{IP:server_ip} %{WORD:request_method} %{URIPATH:request_uri} %{NOTSPACE:request_query} %{NUMBER:port} %{NOTSPACE:username} %{IP:client_ip} %{NOTSPACE:request_agent} %{NOTSPACE:request_referer} %{INT:response_status} %{INT:response_substatus} %{INT:response_winstatus} %{INT:response_timetaken}"
Then I run a DSL query against Elasticsearch to get the logs whose request_timestamp field falls within a specified date range, but the results include logs from outside that range.
However, if I filter the range on the @timestamp field instead, the results seem to be fine, so in my opinion the problem is the field type.
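The range query I use on request_timestamp looks roughly like this (the exact dates are just an example):

{
  "query": {
    "range": {
      "request_timestamp": {
        "gte": "2015-01-01 00:00:00",
        "lte": "2015-01-31 23:59:59"
      }
    }
  }
}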
I have even defined an index template where my field definition is:
"request_timestamp" : {
"type" : "date",
"format" : "yyyy-MM-dd HH:mm:ss"
}
but it still does not work (the default Logstash mapping was text).
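For reference, the whole template looks roughly like this (the template pattern and type name are simplified here, my real one differs in details):

{
  "template": "logstash-*",
  "mappings": {
    "iis": {
      "properties": {
        "request_timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}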
How can I solve this? Or even better, how can I use the date from my logs as the @timestamp field? In my specific case the default @timestamp (ingestion time) is completely useless.
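I suspect the date filter is what I need for the @timestamp part, something along these lines, but I am not sure about the details:

filter {
  date {
    # parse the field extracted by grok and use it as @timestamp
    match => [ "request_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}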