"\xF8" from ASCII-8BIT to UTF-8 (LogStash::Json::GeneratorError)

Hello Team,

I am getting the error below while running a JDBC input against an MS-SQL database. Could you please help?

Error Message:
[ERROR] 2020-12-14 23:55:12.404 [[main]>worker0] elasticsearch - An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>""\xF8" from ASCII-8BIT to UTF-8", :error_class=>"LogStash::Json::GeneratorError", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/json.rb:43:in jruby_dump'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in block in bulk'", "org/jruby/RubyArray.java:2577:in map'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in block in bulk'", "org/jruby/RubyArray.java:1809:in each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:117:in bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:386:in safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:289:in submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:257:in retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:36:in multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:138:in multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121:in multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:293:in `block in start_workers'"]}

Configuration:
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conf.d/mssql-jdbc-8.4.1.jre11.jar"
    jdbc_connection_string => "jdbc:sqlserver://...:1433;databaseName=testdb"
    jdbc_user => ""
    jdbc_password => ""
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM sys.fn_get_audit_file ('C:\SQL_audit*.sqlaudit',default,default)"
    schedule => "* * * * *"
  }
}
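If the column holding the non-UTF-8 bytes were known, the jdbc input's columns_charset option could re-encode it on read. A minimal sketch (the column name audit_data is a placeholder, and ISO-8859-1 is an assumed source encoding; both would need to match the actual schema):

```
input {
  jdbc {
    # ... existing jdbc settings as above ...
    # Assumption: the problem column is named "audit_data" and its bytes
    # are ISO-8859-1; adjust the name and charset to the real data.
    columns_charset => {
      "audit_data" => "ISO-8859-1"
    }
  }
}
```

The plugin also has a charset option that applies one encoding to all columns, which may be simpler if every text column shares the same encoding.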

I am using the setup below:
Elastic Stack 7.9.2
MS-SQL 2017 Evaluation

I tried a couple of methods found online, but they did not resolve the issue.

I removed a couple of fields in Logstash using a mutate filter that seemed to contain random text, but had no luck.
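Another workaround I have seen suggested (a sketch, not a confirmed fix for this data) is a ruby filter that forces every string field to valid UTF-8 before the output stage, replacing undecodable bytes:

```
filter {
  ruby {
    # Sketch: re-encode every top-level string field from binary (ASCII-8BIT)
    # to UTF-8, replacing invalid/undefined byte sequences with "?".
    code => '
      event.to_hash.each do |key, value|
        if value.is_a?(String)
          event.set(key, value.encode("UTF-8", "binary",
                                      invalid: :replace, undef: :replace, replace: "?"))
        end
      end
    '
  }
}
```

This is lossy (the \xF8 bytes become "?"), but it would keep the bulk request to Elasticsearch from failing.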

I would use columns_charset, but I have no idea which column is causing this; I checked the database and did not see any column names containing non-ASCII characters.
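To find which column carries the bad bytes, a ruby filter could tag the offending fields so they show up in a stdout { codec => rubydebug } output. A sketch, assuming events reach the filter block (the field name invalid_utf8_fields is my own placeholder):

```
filter {
  ruby {
    # Sketch: collect the names of string fields whose bytes are not
    # valid UTF-8, so the culprit column can be identified.
    code => '
      bad = event.to_hash.select { |k, v|
        v.is_a?(String) && !v.dup.force_encoding("UTF-8").valid_encoding?
      }.keys
      event.set("invalid_utf8_fields", bad) unless bad.empty?
    '
  }
}
```

Events that come out with an invalid_utf8_fields value point at the column(s) to list in columns_charset.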

Thank you in advance.
