Query to linked server not working, getting a weird error

Hi, I am running a query that uses a linked server to pull data from an MS SQL database (joining against Postgres through the linked server). The query runs fine when I run it in MS SQL Server, but it doesn't work in Logstash. I am getting this error:

[2018-05-08T16:35:37,359][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_driver_library=>"sqljdbc4.jar", jdbc_driver_class=>"com.microsoft.sqlserver.jdbc.SQLServerDriver", jdbc_connection_string=>, jdbc_user=>, jdbc_password=>, statement_filepath=>"query.sql", type=>"wwwproductsfeed", use_column_value=>true, tracking_column=>"start_date", tracking_column_type=>"timestamp", last_run_metadata_path=>"wwwproductsfeed_last_run", lowercase_column_names=>true, parameters=>{"sql_last_value"=>1969-12-31 16:00:00 -0800}, record_last_run=>true, clean_run=>true, id=>"acfd07b234827ccf7693feafba3289481c53c324e0f22e8fdcd1ae47461cd763", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_442777d7-5234-4c6e-8061-c7ebfc306a7f", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5>
Error: can't dup NilClass
Exception: TypeError
Stack: org/jruby/RubyKernel.java:1882:in `dup'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date/format.rb:838:in `_parse'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date.rb:1830:in `parse'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:87:in `set_value'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:237:in `execute_statement'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:264:in `execute_query'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:250:in `run'
/usr/local/Cellar/logstash/6.2.4/libexec/logstash-core/lib/logstash/pipeline.rb:514:in `inputworker'
/usr/local/Cellar/logstash/6.2.4/libexec/logstash-core/lib/logstash/pipeline.rb:507:in `block in start_input'
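For context on what the stack trace is showing (an observation, not from the original thread): the failure happens in the jdbc input's value tracker, where Ruby's `Date.parse` is handed the value of the tracked `start_date` column from the last row. If that value is `nil` — for example because `start_date` was NULL, or the column doesn't appear in the result set under that exact lowercased name — the stdlib raises a `TypeError` of this kind (the exact message varies by Ruby version; on Logstash's JRuby it reads "can't dup NilClass"). A minimal sketch reproducing the stdlib behaviour (`tracked_value` is a hypothetical stand-in for the plugin's tracked value):

```ruby
require 'date'

# Stand-in for what value_tracking.rb receives when the tracked
# start_date column is NULL or missing from the row.
tracked_value = nil

begin
  Date.parse(tracked_value)  # stdlib duplicates/converts its argument -> TypeError on nil
rescue TypeError => e
  puts "TypeError: #{e.message}"
end
```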

The config that I used:

input {
  jdbc {
    jdbc_driver_library => "sqljdbc4.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string =>
    jdbc_user =>
    jdbc_password =>
    statement_filepath => "query.sql"
    type => "wwwproductsfeed"
    use_column_value => true
    tracking_column => start_date
    tracking_column_type => timestamp
    last_run_metadata_path => "wwwproductsfeed_last_run"
    # ------- following are default values --------
    lowercase_column_names => true
    parameters => { }
    record_last_run => true
    clean_run => true
  }
}
filter {}
output {
  if [type] == "wwwproductsfeed" {
    csv {
      csv_options => { "col_sep" => "|" }
      fields => ["product_id", "name", "price", "recommendable", "link_url", "rating", "num_reviews", "brand", "sale_price", "start_date"]
      path => "/tmp/product_%{+YYYY_MM_dd}.txt"
    }
  }
}

  • At the query level, I am converting the datetime to a date format. That works fine in SQL Server, so why does the same query fail when run via Logstash?
  • How can I add a header row to the generated CSV file using the csv output plugin?
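A hedged guess at a first thing to check (an assumption, not a confirmed fix): since `set_value` is what fails, the tracked value itself is suspect. It may help to make sure `start_date` comes back non-NULL and as a real datetime (not a DATE cast to a string) in query.sql, and to quote the tracking settings as strings, along these lines:

```
jdbc {
    # ... connection settings as above ...
    use_column_value => true
    tracking_column => "start_date"        # assumed: quoted, matching the lowercased column name in the result set
    tracking_column_type => "timestamp"    # assumed: the column should stay a datetime, not a DATE-cast string
}
```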

I cannot address your issue, but it looks like you removed the jdbc_connection_string from the config but not from the log, and obfuscated the password in the log but not in the config. If that is correct, you might want to edit your post.

@Badger Thanks for your reply. I intentionally removed the connection string, username, and password from my post.

BUMP

Any idea why I am seeing this error?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.