Logstash date filter question

I'm on the 2.0 stack and am using the JDBC input plugin, but a couple of DB fields that should be dates are coming in as text: "event_date" and "event_timestamp". I'm trying to use the date filter to convert them to dates but can't get it to work. Any help greatly appreciated.

input{
    jdbc{
        ...
    }
}
filter{
    date{
        match => [ "event_date", "YYYY-MM-dd HH:mm:ss Z" ]
        target => "event_date"
    }
}
output{
    elasticsearch{
        ...
    }
}

Configtest comes back OK, but when I run it I get:

Failed parsing date from field {:field=>"event_date", :value=>2015-11-02 02:24:56 -500, :exception=>"cannot convert instance of class org.jruby.RubyTime to class java.lang.String", :config_parsers=>"YYYY-MM-dd HH:mm:ss Z", :config_locale=>"default=en_US", :level=>:warn}

To me this suggests the fields are already dates, but the Discover tab in Kibana reports them as "t" (text).

Also, am I using the "target" option correctly? I want to reuse the same field but as a date type, or should I be creating another field, "event_date_as_date"? I tried commenting that line out so the filter would target @timestamp, but got the same error. In any case I have two date fields seemingly indexed as text, so I can't just target @timestamp.
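
In case it matters, this is the separate-field variant I had in mind ("event_date_as_date" would just be a new field alongside the original):

date{
    match => [ "event_date", "YYYY-MM-dd HH:mm:ss Z" ]
    target => "event_date_as_date"
}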

You don't need to use the date filter as the field is already a "RubyTime" object.
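
If you want to confirm what the field actually contains before it reaches Elasticsearch, a temporary stdout output with the rubydebug codec will print the full event (just for debugging, remove it afterwards):

output {
  stdout { codec => rubydebug }
}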

Thanks for responding, Aaron. So why does event_date show up as text in Kibana? I'm not able to use it in time-based graphs either. Also, it turns out the second field I mentioned, "event_timestamp", is superfluous, and I do need to use "event_date" as the @timestamp for the event. How do I do that? I need to reassign the value of @timestamp to the value of event_date. Don't I need to use the date filter for that?
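
For reference, this is the variant I expected to work, since the date filter writes to @timestamp by default when target is omitted, but it fails with the same RubyTime error:

date{
    match => [ "event_date", "YYYY-MM-dd HH:mm:ss Z" ]
    # no target, so the parsed value should go to @timestamp
}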

I tried changing the date filter to a ruby filter:

ruby{
    code => "event['@timestamp'] = event['event_date']"
}

...but got this error:

Ruby exception occurred: The field '@timestamp' must be a (LogStash::Timestamp, not a Time (2015-11-03 02:30:43 -500) {:level=>:error}

I then tried a mutate filter:

mutate{
    replace => { "@timestamp" => "event_date" }
}

... but got this error:

TypeError: The field '@timestamp' must be a (LogStash::Timestamp, not a String (event_date)

All I need to do is use the event_date as @timestamp, what am I missing?

These are excellent questions. It seems that something unusual may be going on in the JDBC input with regard to the date field. I will consult with my colleagues on how we can address that part.

Meanwhile, there's probably a ruby filter block that will convert the Time value into a LogStash::Timestamp. I'll do some digging and see if I can't get a working conversion.

Thanks again Aaron, I really appreciate this.

Still working on a temporary workaround, but here's the issue I raised.

@CraigFoote, this should work as a temporary workaround. We'll get this conversion built into the JDBC input plugin soon.

Put this in your filter block to convert the RubyTime object to a LogStash::Timestamp object:

ruby {
  # re-wrap the RubyTime value as a LogStash::Timestamp so it indexes as a date
  code => "event['event_date'] = LogStash::Timestamp.new(event['event_date'])"
}

You may want to wrap this in a conditional to only have it work on the proper events.
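
For example, something like this (the condition is only illustrative; adjust it to whatever identifies your JDBC events):

filter {
  if [event_date] {
    ruby {
      code => "event['event_date'] = LogStash::Timestamp.new(event['event_date'])"
    }
  }
}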

Hi Aaron, thanks for the fix. I'm sure it'll work, but I can't test it right now; we've just started installing Shield, Watcher and Marvel, so everything's down at the moment. I did run the JDBC source with the "event_date" and "event_timestamp" fields on our pre-2.0 cluster and they index as dates there, so this bug appears to have been introduced with 2.0.

Bad news Aaron, I'm getting an error with your code:

Ruby exception occurred: uninitialized constant LogStash::Filters::Ruby::Logstash {:level =>:error, :file=>"logstash/filters/ruby.rb", :line=>"41", :method=>"filter"}

I'm not a Ruby developer so any more help you could provide would be greatly appreciated.

bump bump

My apologies, I transcribed "LogStash::Timestamp.new" as "Logstash::Timestamp.new". When I capitalized the "S" your code worked. Thanks again.
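
For anyone else who hits this: the filter I ended up with looks roughly like the block below. It also copies the converted value into @timestamp, which was my other requirement; I believe that assignment is fine once event_date is a LogStash::Timestamp, but treat it as a sketch rather than a verified config:

ruby {
  code => "
    event['event_date'] = LogStash::Timestamp.new(event['event_date'])
    event['@timestamp'] = event['event_date']
  "
}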