Unable to import to ES / Error occurred sending a bulk request to Elasticsearch

Hello community,

I'm desperately trying to import data into Elasticsearch with Logstash from MS SQL Server 2012R2, but it fails every time with the following error:

[2019-08-30T10:57:11,347][ERROR][logstash.outputs.elasticsearch] An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>"\"\\x96\" from ASCII-8BIT to UTF-8", :error_class=>"LogStash::Json::GeneratorError", :backtrace=>["E:/Logstash/logstash-7.3.0/logstash-core/lib/logstash/json.rb:27:in `jruby_dump'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'", "org/jruby/RubyArray.java:2577:in `map'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'", "org/jruby/RubyArray.java:1792:in `each'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:117:in `bulk'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:296:in `safe_bulk'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:201:in `submit'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:169:in `retrying_submit'", "E:/Logstash/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:118:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101:in `multi_receive'", "E:/Logstash/logstash-7.3.0/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"]}

Here's the table that I'm querying from:

[screenshot: TableExample]

Here's a row sample from the table:

[screenshot]

I know that the issue comes from the column "SysRevID". As soon as I remove this field from the query in Logstash, the script runs correctly.

The problem is that I need this field, since it's the one that tracks the version of the row in my database. If the number is different, it means the row was changed, and I need to update the corresponding document in ES.

Here's the config I'm using with Logstash (I removed the settings that aren't relevant to this issue):

input {
	jdbc {
		statement => "SELECT * FROM [DATABASE].[dbo].[TEST]"
	}
}
output {
	elasticsearch {
		index => "logstash-ud08-%{+YYYY.MM.dd}"
	}
}
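
For reference, the settings I removed are basically just the standard jdbc connection options; the full input looks roughly like this (the driver path, connection string and credentials below are placeholders, not my real values):

input {
	jdbc {
		# Placeholder connection settings -- adjust to your own environment
		jdbc_driver_library => "E:/Logstash/drivers/mssql-jdbc.jar"
		jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
		jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=DATABASE"
		jdbc_user => "logstash_user"
		jdbc_password => "********"
		statement => "SELECT * FROM [DATABASE].[dbo].[TEST]"
	}
}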

I tried different things to convert/change the charset of the column, but nothing worked.

I'm quite new to the stack, so I don't understand all the settings yet.

Here's the specs that I'm using:

  • Elasticsearch 7.3
  • Kibana 7.3
  • Logstash 7.3
  • SQL Server collation name: French_CI_AS

Finally, the question is: what can I do to keep this field in my SELECT query?

Thanks!
Alexandre

From...
https://docs.microsoft.com/en-us/sql/connect/jdbc/understanding-data-type-differences?view=sql-server-2017#binary-string-types
I see that an MS SQL Server timestamp data type maps to a JDBC binary data type.

Then in this...


I see the same error message as yours, and a solution.
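
As a side note, since a timestamp/rowversion column is just an 8-byte binary counter, another option would be to convert it to a number in the query itself, so the output never sees raw binary. This is only a sketch on my part (not from the linked thread), and the column names other than SysRevID are placeholders:

input {
	jdbc {
		# Sketch: list the columns explicitly (Col1, Col2 are placeholders) and
		# cast the rowversion to BIGINT so it arrives as a plain number
		statement => "SELECT Col1, Col2, CONVERT(BIGINT, SysRevID) AS sysrevid FROM [DATABASE].[dbo].[TEST]"
	}
}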


Hi @guyboertje,

It works!

But I realize that I had already tried this solution before posting my question, except that the first time I wrote my column name with capital letters ("SysRevID") instead of all lowercase ("sysrevid")...

Here's the final config:

input {
	jdbc {
		statement => "SELECT * FROM [DATABASE].[dbo].[TEST]"
		columns_charset => { "sysrevid" => "UTF-8" }
	}
}
output {
	elasticsearch {
		index => "logstash-ud08-%{+YYYY.MM.dd}"
	}
}
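
Side note: since I want the ES document to be updated (not duplicated) when sysrevid changes, I'm also planning to set a fixed document_id on the output. This is just a sketch, assuming the table's primary key column is called "id" (placeholder name):

output {
	elasticsearch {
		# Single index (no date suffix) so the same row always maps to the same document
		index => "logstash-ud08"
		# "id" is a placeholder for the table's primary key column
		document_id => "%{id}"
		action => "update"
		doc_as_upsert => true
	}
}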

Thanks for the links and the answer!
Have a nice week!
Alexandre


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.