Logstash Error with JDBC plugin

Hi, I have been using Logstash with Elasticsearch for a few days now and everything was working fine: Logstash was querying a MySQL table and indexing the rows into Elasticsearch so I could visualize them in Kibana. Then Logstash suddenly stopped working. When running
```
\bin\logstash.bat -f simple_config.conf
```

I started getting this error:

```
[ERROR][logstash.javapipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<TypeError: no implicit conversion of Integer into String>, :backtrace=>[
  "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date/format.rb:335:in `_parse'",
  "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date.rb:734:in `parse'",
  "C:/ELK/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:87:in `set_value'",
  "C:/ELK/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:36:in `initialize'",
  "C:/ELK/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:29:in `build_last_value_tracker'",
  "C:/ELK/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:216:in `register'",
  "C:/ELK/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:192:in `block in register_plugins'",
  "org/jruby/RubyArray.java:1792:in `each'",
  "C:/ELK/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:191:in `register_plugins'",
  "C:/ELK/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:292:in `start_inputs'",
  "C:/ELK/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:248:in `start_workers'",
  "C:/ELK/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:146:in `run'",
  "C:/ELK/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:105:in `block in start'"
], :thread=>"#<Thread:0x3e6f40c3 run>"}
```

I uninstalled the whole Elastic stack and MySQL, changed the query to a different one, and even tried simply writing the output with stdout, but I always get an error.

Here is my configuration file:
```
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.11.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/crretriever"
    jdbc_user => "xxxx"
    jdbc_password => "xxxx"
    statement => "SELECT * FROM allbuilds;"
  }
}

output {
  stdout {
    codec => json
  }
}
```

Thanks for your help

The clue (very obscure) is ``value_tracking.rb:36:in `initialize'``.

The jdbc input can track the last value used as `sql_last_value`. You can specify the path to the file, but if you don't, LS stores it in `$HOME` via:

```
config :last_run_metadata_path, :validate => :string, :default => "#{ENV['HOME']}/.logstash_jdbc_last_run"
```

The config that you show above means that the jdbc input is expecting to track by time, but the value read from the default local file is an integer. The Time value tracker expects to parse a serialised Time string, and it fails because it can't parse an integer. You must have run LS in the past with a config that tracked by a numeric value.
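You can reproduce the failure outside Logstash with plain Ruby. This is only a sketch of the mechanism, not the plugin's actual code: the persisted `sql_last_value` eventually reaches `Date`/`DateTime` parsing (see the `date/format.rb` frame in the backtrace), which only accepts strings:

```ruby
require 'date'

# A serialised time string, as written by timestamp tracking, parses fine.
DateTime.parse("2019-07-01 10:00:00")

# An integer left over from numeric tracking raises the exact error
# from the Logstash log.
begin
  DateTime.parse(0)
rescue TypeError => e
  puts e.message  # => no implicit conversion of Integer into String
end
```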

This is a bug and I am fixing it. The workaround is to delete the default file and specify an explicit path in the config. Ideally the path should be named with the intended value-tracking type in mind, so that while you are experimenting with different settings you can control exactly which value type is used.

Exactly. By debugging, I understood that this came from a previous configuration file that was indeed tracking time. I first deleted `.logstash_jdbc_last_run`, but that gave me another error. I then uninstalled Logstash, deleted all the files in `\temp\`, and after that I was able to use Logstash normally.
Where can I specify a file to store `sql_last_value`? And if I'm tracking dates, how should I name the path? Thank you.

Use the `last_run_metadata_path` setting in the jdbc input config section.

Naming: I would suffix with `-ts` if tracking by timestamp and `-num` if tracking by a numeric value.
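For example, a jdbc input tracking a timestamp column might look like this. Note that the column name `updated_at` and the metadata file path are illustrative, not taken from your schema:

```
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.11.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/crretriever"
    jdbc_user => "xxxx"
    jdbc_password => "xxxx"
    statement => "SELECT * FROM allbuilds WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    last_run_metadata_path => "C:/ELK/allbuilds_last_run-ts"
  }
}
```

With the `-ts` suffix in the file name, it stays obvious which tracking type that file holds, and switching to numeric tracking later would get its own `-num` file instead of colliding with this one.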

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.