Logstash 5.0 configuration.invalid_plugin_register error in logs when using date filter

hi there,

I have the following config for logstash:

```
input {
  file {
    path           => "/csvfiles/*.csv"
    type           => "agents"
    start_position => "beginning"
    sincedb_path   => "/csvfiles/dbfile"
  }
}

filter {
  csv {
    columns   => ['Listingid', 'Listingdate', 'Status', 'Recorddate']
    separator => ","
  }

  date {
    add_field => { "foo" => "Hello world" }
  }
}

output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    document_type => "node_points"
    template      => "/etc/logstash/agent_template.json"
    index         => "agent_simple"
  }
}
```

Without the date block under filter everything works fine, but when I add it I get the following in the logs:

```
[2016-12-06T16:27:55,746][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:175:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:153:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:250:in `start_pipeline'"]}
```

I have tried modifying the date block and wrapping it in its own filter section, but I always get the same error in the logs.

I am using version 5.0 of the stack on Ubuntu.

Any help greatly appreciated.

I'm not sure what you're trying to do with your date filter, but the match option is mandatory. Here's the error you should've seen in the log:

```
The match setting should contains first a field name and at least one date format, current value is ...
```
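
For reference, a minimal date filter needs at least a source field and one pattern in match; the field name and format below are only placeholders:

```
filter {
  date {
    # first element is the field to parse, followed by one or more date patterns to try
    match => ["some_date_field", "ISO8601"]
  }
}
```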

Hi Magnus,

I started out with the intention of setting the Recorddate field as @timestamp, but in order to debug the issue I had resorted to one of the samples from the Logstash documentation to reduce the complexity while I get it working.

I will include the match setting and see how it goes. There was nothing in the logs indicating that the match setting is mandatory.

Thanks

Hi Magnus

Managed to get further now with:

```
filter {
  date {
    match => ["Recorddate", "dd-MMM-yyyy HH:mm:ss"]
  }
}
```

In the logs now I get:

```
Failed parsing date from field {:field=>"Recorddate", :value=>"Recorddate", :exception=>"Invalid format: \"Recorddate\"", :config_parsers=>"dd-MMM-yyyy HH:mm:ss", :config_locale=>"default=en_US"}
```

Yet the format I have provided looks correct to me.

Recorddate looks like this:

"15-Nov-2016 23:49:01"

Thanks again.

Hi Magnus,

I have added locale => "en" but still receive the same date formatting error.
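
For reference, this is roughly what the filter looks like now (assuming the locale goes into the same date block as before):

```
filter {
  date {
    # same match as before, with an explicit English locale for the month abbreviation
    match  => ["Recorddate", "dd-MMM-yyyy HH:mm:ss"]
    locale => "en"
  }
}
```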

Any help appreciated.

Could I follow up on this issue, Magnus, please?

All the best

The error message suggests that the parsing fails because the Recorddate field contains the string "Recorddate", which isn't consistent with what you're saying. If Recorddate really contains "15-Nov-2016 23:49:01" I don't know what's up.

That's great, Magnus, thank you. I suspect what's happening is that the csv parser ignores the first line containing the CSV column names, whereas this filter doesn't, and so it reads an incorrect value.
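
If that's the case, one possible sketch is to drop the header row before the date filter runs; the check against the Listingid column is only an assumption about what the header line looks like:

```
filter {
  csv {
    columns   => ['Listingid', 'Listingdate', 'Status', 'Recorddate']
    separator => ","
  }

  # Skip the header row so the date filter never sees the literal column names
  if [Listingid] == "Listingid" {
    drop { }
  }

  date {
    match => ["Recorddate", "dd-MMM-yyyy HH:mm:ss"]
  }
}
```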
