Error when converting Date

I'm trying to copy the date from a field (start_date) into @timestamp using the following filter:

date {
  match => [ "start_date", "yyyy-MM-dd HH:mm:ss Z" ]
}

start_date is in the format: 2015-07-26 11:54:43 +0100

I get the following error:
:exception=> "cannot convert instance of class org.jruby.RubyTime to class java.lang.String", :config_parsers=>"yyyy-MM-dd HH:mm:ss Z"

What am I doing wrong?

Are you getting this error in Kibana or Logstash?

In Logstash.

Ah, I ask because you posted this in the Kibana section :slight_smile:

Can you paste your entire config?

I don't know if this is the solution, but you can have a look at my config.

Here's my config:

input {
  jdbc {
    type => "test"

    ...

    statement => "select c.`starttime` from `mydb`.`table` c
                  where c.`starttime` >= '2015-07-26'"
  }
}

output { stdout { codec => rubydebug } }

filter {
  if [type] == "test" {
    date {
      match => [ "starttime", "YYYY-MM-dd HH:mm:ss Z" ]
    }
  }
}

And error message:
Failed parsing date from field {:field=>"starttime", :value=>2015-07-26 17:32:35 +0100, :exception=>"cannot convert instance of class org.jruby.RubyTime to class java.lang.String", :config_parsers=>"YYYY-MM-dd HH:mm:ss Z", :config_locale=>"default=en_US", :level=>:warn}
{
     "starttime" => 2015-07-26 11:43:09 +0100,
      "@version" => "1",
    "@timestamp" => "2015-07-27T10:12:16.405Z",
          "type" => "test",
          "tags" => [
        [0] "_dateparsefailure"
    ]
}

Is your output really before your filter?

The issue is that the date obtained from the database comes in as a Ruby Time object rather than a string, and the date filter only accepts strings, so it fails.

A workaround that works is to convert the date to a string in the SQL statement using MySQL's DATE_FORMAT.
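For example, something like this (a sketch based on the config above; note that MySQL's DATE_FORMAT has no timezone specifier, so the matching date filter pattern would drop the trailing Z):

```
statement => "select DATE_FORMAT(c.`starttime`, '%Y-%m-%d %H:%i:%s') as starttime
              from `mydb`.`table` c
              where c.`starttime` >= '2015-07-26'"
```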

Nonetheless, I should be able to do this with, say, a ruby filter. Any thoughts?
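The ruby filter route boils down to formatting the Time object back into a string the date filter can parse. In plain Ruby the conversion would look like this (a sketch; inside Logstash you'd do the equivalent in a ruby filter's code block):

```ruby
# A Ruby Time object, similar to what the jdbc input hands the pipeline
t = Time.new(2015, 7, 26, 11, 54, 43, "+01:00")

# Format it to match the date filter pattern "yyyy-MM-dd HH:mm:ss Z"
s = t.strftime("%Y-%m-%d %H:%M:%S %z")
puts s  # "2015-07-26 11:54:43 +0100"
```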

I have the same issue with Oracle. Did you figure this problem out?

Converting the date to a string using the SQL function DATE_FORMAT worked for me.

I got rid of that error but now the dates do not match!
Sample config

input {
  jdbc {
    jdbc_driver_library => "logstash-1.5.0/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//"
    jdbc_user => ""
    jdbc_password => ""

    statement => "SELECT to_char(creation_time,'dd-MON-YYYY HH24.MI.SSxFF') as logStoreTime from log where to_char(creation_time,'dd-MON-YYYY HH.MI.SSxFF AM') > :sql_last_start"
    record_last_run => true
    last_run_metadata_path => "/.logstash_jdbc_last_run"
    type => "log"
  }
}
filter {
  date {
    locale => "en"
    timezone => "UTC"
    match => [
      "logStoreTime" , "dd-MMM-YYYY HH:mm.ss.SSSSSS"
    ]
  }
}

And my output looks like this:

{
    "logstoretime" => "04-AUG-2015 18.33.07.980000",
        "@version" => "1",
      "@timestamp" => "2015-08-05T14:50:24.981Z",

}
{
    "logstoretime" => "04-AUG-2015 18.34.14.747000",
        "@version" => "1",
      "@timestamp" => "2015-08-05T14:50:24.985Z",

}
{
    "logstoretime" => "04-AUG-2015 18.31.41.644000",
        "@version" => "1",
      "@timestamp" => "2015-08-05T14:50:24.989Z",

}

@timestamp is UTC, so you need to add your TZ offset to it.
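To illustrate the offset: @timestamp is stored in UTC, so a local time carrying a +01:00 offset shows up one hour earlier once converted. In plain Ruby (a sketch using a value shaped like the ones above):

```ruby
require 'time'

# A local timestamp carrying a +01:00 offset...
local = Time.parse("2015-08-04 18:33:07 +01:00")

# ...is the same instant, shown one hour earlier, in UTC
puts local.utc.iso8601  # "2015-08-04T17:33:07Z"
```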

I tried to update the timezone, but it's still giving me errors. Now I suspect something is wrong with the format. I tried to include a timezone and match it with the date filter, but it didn't catch on. So then I tried to capture it with a grok statement, but it's giving me an error even though I tested the pattern and it's a 99% match:

input {
  jdbc {
    jdbc_driver_library => "logstash-1.5.0/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//"
    jdbc_user => ""
    jdbc_password => ""

    statement => "SELECT to_char(creation_time,'YYYY-MM-dd HH24:MI:SSxFF TZR') as logStoreTime from log where creation_time > :sql_last_start"
    record_last_run => true
    last_run_metadata_path => "/.logstash_jdbc_last_run"
    type => "log"
  }
}

filter {
  grok {
    match => {"logStoreTime" => "(?<logStoreTime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d{6} \+\d{2}:\d{2})"}
  }
    # date {
    #   locale => "en"
    #   timezone => "GMT"
    #   match => [
    #     "logStoreTime" , "YYYY-MM-dd HH:mm:ss.SSSSSS Z"
    #   ]
    # }

}

I commented out the date part after trying to move the date format around (e.g. MM-YYYY-dd); even when it no longer matched the select format, no errors occurred. So I now think it has to be something with the data type, because grok is not matching. Here is some more info.
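For what it's worth, the regex itself does match the sample value when run in plain Ruby outside Logstash (note that in the rubydebug output the field name appears lowercased as logstoretime, while the grok filter matches against logStoreTime):

```ruby
# The same named-capture pattern from the grok filter, run in plain Ruby
pattern = /(?<logStoreTime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d{6} \+\d{2}:\d{2})/
m = pattern.match("2015-08-04 18:34:14.747000 +00:00")
puts m[:logStoreTime]  # "2015-08-04 18:34:14.747000 +00:00"
```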

 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 CREATION_TIME                             NOT NULL TIMESTAMP(6)

My output from the current filter config (date not matching):

{
    "logstoretime" => "2015-08-04 18:34:14.747000 +00:00",
        "@version" => "1",
      "@timestamp" => "2015-08-06T12:52:15.850Z",
            "type" => "test",
            "tags" => [
        [0] "_grokparsefailure"
    ]
}
{
    "logstoretime" => "2015-08-04 18:34:18.661000 +00:00",
        "@version" => "1",
      "@timestamp" => "2015-08-06T12:52:15.854Z",
            "type" => "test",
            "tags" => [
        [0] "_grokparsefailure"
    ]
}

If I comment out the grok statement and just use date:

{
    "logstoretime" => "2015-08-04 18:34:14.747000 +00:00",
        "@version" => "1",
      "@timestamp" => "2015-08-06T13:07:39.212Z",
            "type" => "test"
}
{
    "logstoretime" => "2015-08-04 18:34:18.661000 +00:00",
        "@version" => "1",
      "@timestamp" => "2015-08-06T13:07:39.214Z",
            "type" => "test"
}

Oracle 11g and logstash 1.5.1

I just tried to delete the YYYY-MM-dd part of the date filter pattern so it only has the time, and I get similar output with no errors... I thought it would at least say the date isn't matching:

{
    "logstoretime" => "2015-08-04 18:34:14.747000 +00:00",
        "@version" => "1",
      "@timestamp" => "2015-08-06T13:30:48.389Z",
            "type" => "test"
}
{
    "logstoretime" => "2015-08-04 18:34:18.661000 +00:00",
        "@version" => "1",
      "@timestamp" => "2015-08-06T13:30:48.392Z",
            "type" => "test"
}

Is there a way to use this as the statement? It works well in the JDBC river, but errors show up when I use the plugin:

select creation_time as \"@timestamp\" from log;