Error when passing date to date filter


#1

Hi,

I am fetching rows from a DB using the Logstash jdbc input plugin. I have a time entry processingtime in my DB like:

03-01-17 03:32:50.006000000 PM

and in Logstash, I am using:

date {
	match => [ "processingtime", "dd-MM-yy HH:mm:ss.SSS" ]
	timezone => "UTC"
	target => "@timestamp"
}

But I am getting an error like:

[2017-01-24T14:36:58,576][WARN ][logstash.filters.date    ] Failed parsing date from field {:field=>"processingtime", :value=>2017-01-05T23:08:48.292Z, :exception=>"cannot convert instance of class org.logstash.ext.JrubyTimestampExtLibrary$RubyTimestamp to class java.lang.String", :config_parsers=>"dd-MM-yy HH:mm:ss.SSS aa", :config_locale=>"default=en_US"}

How can I fix this? Thanks.


(Mark Walkom) #2

Your pattern does not match the timestamp.

You've got 6 extra 0's on the end you need to cater for.


(Magnus Bäck) #3

You've got 6 extra 0's on the end you need to cater for.

...and AM/PM.


(Mark Walkom) #4

It's always AM/PM somewhere :wink:


#5

Hi @warkolm @magnusbaeck,
I tried:

match => [ "processingtime", "dd-MM-yy HH:mm:ss.SSSSSSSSS AM/PM" ]

and got this error:

[2017-01-24T15:26:07,208][ERROR][logstash.agent           ] Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/data/setup/GA5/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:297:in `setupMatcher'", "org/jruby/RubyArray.java:1613:in `each'", "/data/setup/GA5/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:224:in `setupMatcher'", "/data/setup/GA5/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:188:in `register'", "/data/setup/GA5/logstash-5.0.0/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/data/setup/GA5/logstash-5.0.0/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "/data/setup/GA5/logstash-5.0.0/logstash-core/lib/logstash/pipeline.rb:153:in `run'", "/data/setup/GA5/logstash-5.0.0/logstash-core/lib/logstash/agent.rb:250:in `start_pipeline'"]}

#6

Also, when I check the JSON (I have removed grok and inserted the data into Elasticsearch), the time pattern is like 2017-01-03T07:40:26.188Z.


(Magnus Bäck) #7
match => [ "txentrytime", "dd-MM-yy HH:mm:ss.SSSSSSSSS AM/PM" ]

No, don't literally put AM/PM there. Use a, the date pattern token for matching AM/PM. You should also use hh instead of HH, since hh is the 12-hour clock hour. See http://joda-time.sourceforge.net/apidocs/org/joda/time/format/DateTimeFormat.html.


#8

Hi,

I tried:

match => [ "processingtime", "dd-MM-yy hh:mm:ss.SSSSSSSSS aa" ]

and got this error:

[2017-01-24T15:39:42,487][WARN ][logstash.filters.date    ] Failed parsing date from field {:field=>"processingtime", :value=>2017-01-03T04:19:34.934Z, :exception=>"cannot convert instance of class org.logstash.ext.JrubyTimestampExtLibrary$RubyTimestamp to class java.lang.String", :config_parsers=>"dd-MM-yy hh:mm:ss.SSSSSSSSS aa", :config_locale=>"default=en_US"}

(Magnus Bäck) #9

Oh. Where does the timestamp come from, a jdbc input? If so, perhaps you can convert the timestamp to a string in the query? Otherwise I think the only option is to use a ruby filter to convert the timestamp to a string that the date filter can process.
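For later readers, the two options above might look roughly like this. This is a sketch only: the TO_CHAR call assumes an Oracle source (the timestamp format in the first post looks like an Oracle default), the field name is taken from this thread, and event.get/event.set assume the Logstash 5.x event API.

```
# Option 1: cast to a string inside the jdbc input's SQL statement
# (Oracle syntax shown as an assumed example; adjust for your DB):
#   SELECT TO_CHAR(processingtime, 'DD-MM-RR HH:MI:SS.FF9 AM') AS processingtime, ...

# Option 2: convert the Timestamp object to a string with a ruby
# filter, then let the date filter parse the resulting ISO8601 string:
filter {
  ruby {
    code => "event.set('processingtime', event.get('processingtime').to_s)"
  }
  date {
    match => [ "processingtime", "ISO8601" ]
    target => "@timestamp"
  }
}
```

With option 2 the original Timestamp is already valid, so the date filter mostly serves to copy it into @timestamp explicitly.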


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.