Error using date match filter

Hi everyone,

I keep getting an error while trying to make the timestamp ("time") field from my MS SQL database the @timestamp field, using the date filter's match option in Logstash.

This is the timestamp value I'm trying to parse:

`2015-06-02 17:13:51.331 +00:00`

This is the filter I'm using in my conf file:

```
filter {
  date {
    match  => ["time", "YYYY-MM-dd HH:mm:ss.SSS"]
    locale => "en"
    target => "@timestamp"
  }
}
```

This filter produces the following error:

```
Failed parsing date from field {:field=>"time", :value=>#<Java::MicrosoftSql::DateTimeOffset:0xe9c298>, :exception=>"failed to coerce microsoft.sql.DateTimeOffset to java.lang.String", :config_parsers=>"YYYY-MM-dd HH:mm:ss.SSS", :config_locale=>"en", :level=>:warn}
```

Side note: I get the same error with these match patterns too:

```
match => ["time", "YYYY-MM-dd HH:mm:ss Z"]
match => ["time", "YYYY-MM-dd HH:mm:ss"]
```

I'm pretty new to Logstash and the ELK stack in general. It looks to me like an issue with the data type being exported by the MS SQL database, but I'm not sure. If anyone knows what's going wrong here or has any ideas on how to fix it, I would love some help.

Thanks for any help!

You should just be able to use `TIMESTAMP_ISO8601` as the pattern.
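For anyone landing here later, a minimal sketch of what that could look like: strictly speaking, `TIMESTAMP_ISO8601` is the grok pattern name, and the date filter's built-in equivalent is just called `ISO8601`, so the filter would be along these lines (assuming the field is already a string):

```
filter {
  date {
    # "ISO8601" is the date filter's built-in parser; it accepts
    # fractional seconds and timezone offsets like "+00:00".
    match  => ["time", "ISO8601"]
    target => "@timestamp"
  }
}
```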

Hi,

Are you using any kind of grok pattern to extract the data?

I had this issue too and got around it by first converting the SQL date field to a string and then running the date filter:

```
filter {
  mutate {
    # Coerce the microsoft.sql.DateTimeOffset object into a
    # string so the date filter can parse it.
    convert => [ "time", "string" ]
  }

  date {
    match  => [ "time", "ISO8601" ]
    target => "@timestamp"
  }

  mutate {
    # Drop the original field once @timestamp is set.
    remove_field => [ "time" ]
  }
}
```
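For what it's worth, the string conversion is the key step here: the warning above says Logstash "failed to coerce microsoft.sql.DateTimeOffset to java.lang.String", i.e. the JDBC input hands the column over as a DateTimeOffset object rather than text, and the date filter only knows how to parse strings (or numbers, for UNIX timestamps). Once it's a string, the ISO8601 parser also picks up the "+00:00" offset that the original "YYYY-MM-dd HH:mm:ss.SSS" pattern ignored.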

Thank you @mick66! This resolved the MSSQL time issue for me as well.

I'm using the JDBC driver to harvest time series data from a SolarWinds Orion database (pulling in triggered alerts) for dashboarding and analysis in Kibana. Your example of having to convert the SQL date field to a string first is EXACTLY what I needed to know for my solution. Thanks!
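In case it helps anyone else wiring this up, here's a rough sketch of the kind of jdbc input I'm pairing the filter with. The connection string, driver path, credentials, and query below are all placeholders, not my actual Orion setup:

```
input {
  jdbc {
    # Placeholder connection details -- adjust for your own server.
    jdbc_connection_string => "jdbc:sqlserver://orion-db:1433;databaseName=SolarWindsOrion"
    jdbc_driver_library    => "/opt/logstash/drivers/sqljdbc42.jar"
    jdbc_driver_class      => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user              => "logstash"
    jdbc_password          => "changeme"
    schedule               => "* * * * *"
    # Hypothetical query; the real alert table and columns will differ.
    statement              => "SELECT AlertTime AS time, Message FROM TriggeredAlerts"
  }
}
```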
