Dear experts:
I'm using the jdbc input to load data from an Oracle table into Elasticsearch. The table has a BLOB field. When the BLOB contains only text content, the transfer works, but when it contains binary data, such as a PDF file, it fails. The error log is below:
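For reference, here is the relevant part of my jdbc input configuration, reconstructed from the plugin dump in the error log below (the full statement lives in the referenced SQL file):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@192.168.0.1:1521:oracle"
    jdbc_user              => "oracle"
    jdbc_driver_library    => "/data/logstash/ojdbc7.jar"
    jdbc_driver_class      => "Java::oracle.jdbc.driver.OracleDriver"
    # msg_bytes is the BLOB column; forcing UTF-8 here is what seems to fail
    columns_charset        => { "msg_bytes" => "UTF-8" }
    statement_filepath     => "/data/logstash/jdbc_test.sql"
    tracking_column        => "LOG_TIME"
    last_run_metadata_path => "/data/logstash/logstash-6.3.1/last_run"
    jdbc_paging_enabled    => true
    jdbc_page_size         => 50000
    jdbc_fetch_size        => 5000
    type                   => "jdbc"
  }
}
```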
```
[2018-07-12T17:56:54,720][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_connection_string=>"jdbc:oracle:thin:@192.168.0.1:1521:oracle", jdbc_user=>"oracle", jdbc_password=>, jdbc_driver_library=>"/data/logstash/ojdbc7.jar", jdbc_driver_class=>"Java::oracle.jdbc.driver.OracleDriver", columns_charset=>{"msg_bytes"=>"UTF-8"}, record_last_run=>true, use_column_value=>false, tracking_column=>"LOG_TIME", last_run_metadata_path=>"/data/logstash/logstash-6.3.1/last_run", clean_run=>false, jdbc_paging_enabled=>true, jdbc_page_size=>50000, jdbc_fetch_size=>5000, jdbc_validate_connection=>true, connection_retry_attempts=>5, statement_filepath=>"/data/logstash/jdbc_test.sql", add_field=>{"key_id"=>"%{msg_id}%-%{direction}-%{log_location}", "log_bytes"=>""}, type=>"jdbc", id=>"cbe228beb0a0f45de90c1aa4976decf0effd7c4a323fe0f7f209ffd03f0d1d2e", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_4678de54-517f-4429-ae4f-2e6f34b5cc74", enable_metric=>true, charset=>"UTF-8">, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>2018-07-12 17:51:16 +0800}, tracking_column_type=>"numeric", lowercase_column_names=>true>
Error: private method `warn' called for nil:NilClass
Exception: NoMethodError
Stack: /data/logstash/logstash-6.3.1/logstash-core/lib/logstash/util/charset.rb:28:in `block in convert'
org/jruby/RubyKernel.java:1741:in `tap'
/data/logstash/logstash-6.3.1/logstash-core/lib/logstash/util/charset.rb:27:in `convert'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:288:in `convert'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:267:in `block in execute_query'
org/jruby/RubyHash.java:1343:in `each'
org/jruby/RubyEnumerable.java:830:in `map'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:267:in `block in execute_query'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:231:in `block in execute_statement'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:253:in `block in perform_query'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/dataset/actions.rb:151:in `block in each'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:795:in `process_result_set'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:726:in `block in fetch_rows'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:244:in `block in execute'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:675:in `statement'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:239:in `block in execute'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/connection_pool/threaded.rb:91:in `hold'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/database/connecting.rb:270:in `synchronize'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:238:in `execute'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/dataset/actions.rb:1082:in `execute'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/adapters/jdbc.rb:726:in `fetch_rows'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/dataset/actions.rb:151:in `each'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:252:in `block in perform_query'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/extensions/pagination.rb:60:in `block in each_page'
org/jruby/RubyRange.java:485:in `each'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/sequel-5.9.0/lib/sequel/extensions/pagination.rb:60:in `each_page'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:251:in `perform_query'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:229:in `execute_statement'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:264:in `execute_query'
/data/logstash/logstash-6.3.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:250:in `run'
/data/logstash/logstash-6.3.1/logstash-core/lib/logstash/pipeline.rb:512:in `inputworker'
/data/logstash/logstash-6.3.1/logstash-core/lib/logstash/pipeline.rb:505:in `block in start_input'
```
Can anyone kindly help me solve this problem?
Alternatively, can anyone tell me how to configure Logstash so that it skips the failing row and continues on this error, instead of restarting the plugin and retrying?
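In case it helps to clarify what I'm after: one idea I'm considering (I'm not sure it is the right approach) is to avoid pushing raw binary through the charset conversion at all, by encoding the BLOB into a text-safe form inside the SQL statement itself. A rough sketch of what `jdbc_test.sql` might become (the table name `my_log_table` is a placeholder; `DBMS_LOB.SUBSTR` on a BLOB returns RAW and reads at most 2000 bytes per call, so this only covers small BLOBs):

```sql
-- Hypothetical rework of jdbc_test.sql: select a hex rendering of the
-- BLOB instead of the raw bytes, so the value reaching Logstash is
-- plain ASCII text and never needs binary-to-UTF-8 conversion.
SELECT msg_id,
       log_time,
       RAWTOHEX(DBMS_LOB.SUBSTR(msg_bytes, 2000, 1)) AS msg_bytes_hex
FROM   my_log_table
WHERE  log_time > :sql_last_value
```

Would something along these lines work, or is there a better way to handle BLOB columns with the jdbc input?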