Date Parser error

I am using Winlogbeat to ship Windows Event Logs to LogStash.

According to the "codec => rubydebug" output, the JSON output contains a field named "@timestamp". So I used the date filter to parse it and got an error:


JSON Display

"@timestamp" = "2016-02-29T21:01:37.300Z"

logstash.json

date {
    match => ["@timestamp", "ISO8601"]
    target => "arrive_at"
}

I see no "arrive_at" field in the output.

Error

message=>"failed parsing date from field",:field=>"@timestamp", :value=>"2016-02-29T21:01:37.300Z", exception="cannot convert instance of class org.jruby.RubyObject to class java.lang.string", :config_parsers=>"ISO8601",..."


I thought ISO8601 would match this date format.

Also, this date is 5 hrs ahead of my time zone here. How do I convert it to EST?

Thank you to anyone who can help me with this. I have been trying to understand how the date filter works, but I have failed miserably.

The date filter's purpose is to convert a string into a UTC time object, which is stored by default in the @timestamp field.
So you cannot re-apply it to such an object, but I admit the error is way too cryptic.

The @timestamp field can be used as-is in Elasticsearch and Kibana. Elasticsearch expects it to be in UTC, and Kibana has the option to display the date in the correct user time zone.
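For reference, the typical use of the date filter is to parse a timestamp string that your own pipeline extracted into a field, along the lines of this sketch ("log_time" is just a made-up field name for illustration):

filter {
  date {
    # "log_time" is a hypothetical string field, e.g. "29/Feb/2016:21:01:37 +0000"
    match => ["log_time", "dd/MMM/yyyy:HH:mm:ss Z"]
    # with no explicit target, the parsed UTC time object overwrites @timestamp
  }
}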

If you want to modify this field for another purpose, please explain further.

Hello wiibaa,

Thank you for the reply.

Even if I do NOT use the date filter, @timestamp is still there in the output. I think Logstash automatically creates it for every Windows event. Is that true? If so, should I still use the date filter?

My initial intent in using the date filter was to convert the date in the "2016-02-29T21:01:37.300Z" format into something that I want. For example, I would like the format to be "Feb 29 2016 21:01:37". However, after reading your explanation, I am sure I should not fool around with @timestamp, because Kibana relies on this timestamp being in the Zulu (UTC) format to determine the real time based on the local time zone of the machine. Am I right?

Note that I started learning the Elastic Stack 5 days ago, part-time, so my knowledge of this topic is pretty weak.

Based on your explanation that "the date filter's purpose is to convert a string into a UTC time object, which is stored by default in the @timestamp field", once the date string is converted into a UTC time object, how can I manipulate it? Can I extract the year, day, or hour (%{YYYY}, %{HH}, ...) using field references after the date filter is applied?
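For example, would something like this work? I am only guessing at the syntax here, so please correct me:

mutate {
  # hoping %{+YYYY} and %{+HH} would be rendered from @timestamp (in UTC)
  add_field => {
    "event_year" => "%{+YYYY}"
    "event_hour" => "%{+HH}"
  }
}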

Sorry for all these questions. I am still learning and still confused about how to use the date filter.

Edison

Even if I do NOT use the date filter, @timestamp is still there in the output. I think Logstash automatically creates it for every Windows event. Is that true?

I don't know for sure if it's Winlogbeat that populates the @timestamp field or if it's the receiving Logstash instance that does it.

If so, should I still use the date filter?

Possibly, but parsing the @timestamp field doesn't make sense. If you have another timestamp field in the events, you probably want to use the date filter on that.
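Something along these lines, where "event_time" is only a made-up name standing in for whatever timestamp field your events actually contain:

date {
  # hypothetical field holding the event's own timestamp string
  match => ["event_time", "ISO8601"]
  # the parsed UTC time object replaces @timestamp unless you set a different target
}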

My initial intent in using the date filter was to convert the date in the "2016-02-29T21:01:37.300Z" format into something that I want. For example, I would like the format to be "Feb 29 2016 21:01:37".

No, that's not what the date filter is for. Don't fight the system. Leave the format of the @timestamp field alone and fix your presentation layer to format timestamps according to your preference.
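If you do need @timestamp's value elsewhere in the Logstash pipeline, the usual mechanism is a sprintf reference like %{+YYYY.MM.dd}, which is rendered from @timestamp. A common example is a daily index name; the hosts and index pattern below are just placeholders:

output {
  elasticsearch {
    # placeholder host; adjust to your cluster
    hosts => ["localhost:9200"]
    # %{+YYYY.MM.dd} is formatted from @timestamp (in UTC)
    index => "winlogbeat-%{+YYYY.MM.dd}"
  }
}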