MSSQL data into Elasticsearch using Logstash

Hey Guys,

I have been trying to load data from an MSSQL database into Elasticsearch using Logstash.

I have succeeded in loading some smaller tables, but when I try to load tables like EVENTS, which has around 10k records, the upload fails. Here is the error information:

```
[2018-06-12T17:43:00,192][ERROR][logstash.outputs.elasticsearch] An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>"\"\xCB\" from ASCII-8BIT to UTF-8", :error_class=>"LogStash::Json::GeneratorError", :backtrace=>[
  "/usr/share/logstash/logstash-core/lib/logstash/json.rb:28:in `jruby_dump'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:118:in `block in bulk'",
  "org/jruby/RubyArray.java:2486:in `map'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:118:in `block in bulk'",
  "org/jruby/RubyArray.java:1734:in `each'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:116:in `bulk'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:243:in `safe_bulk'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:157:in `submit'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:125:in `retrying_submit'",
  "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:36:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:13:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:49:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:477:in `block in output_batch'",
  "org/jruby/RubyHash.java:1343:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:476:in `output_batch'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:428:in `worker_loop'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:386:in `block in start_workers'"]}
```

I'm not sure what the issue is here or where to start troubleshooting; any advice would be greatly appreciated.

Thanks
Gautham
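
For reference, a pipeline of the kind described above generally looks like the following. This is a minimal sketch; the driver path, connection string, credentials, query, and index name are all placeholders rather than details from this thread:

```
input {
  jdbc {
    # Placeholder paths and credentials -- substitute your own.
    jdbc_driver_library => "/opt/drivers/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://dbhost:1433;databaseName=monitoring"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    statement => "SELECT * FROM EVENTS"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "events"
  }
}
```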

According to this:

I would say you have special characters in your database (such as the 0xCB byte in the error message) that are not valid UTF-8, so the JSON serializer fails when building the bulk request.
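
If so, one thing worth trying is the jdbc input's `charset` option (or `columns_charset` for individual columns), which transcodes column values to UTF-8 as they are read. Below is a minimal sketch of just the input block (the output stays the same), assuming the text columns use a Windows-1252-based SQL Server collation; the connection details are placeholders and `event_message` is a hypothetical column name:

```
input {
  jdbc {
    jdbc_driver_library => "/opt/drivers/mssql-jdbc.jar"   # placeholder
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://dbhost:1433;databaseName=monitoring"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    statement => "SELECT * FROM EVENTS"
    # Assumption: columns are stored in a Windows-1252 (Latin) collation.
    # Transcode every column to UTF-8 before events enter the pipeline:
    charset => "CP1252"
    # Or transcode only the columns that carry free text
    # ("event_message" is a hypothetical column name):
    # columns_charset => { "event_message" => "CP1252" }
  }
}
```

The values are Ruby encoding names, so "CP1252", "Windows-1252", or "ISO-8859-1" would all work, depending on what the collation actually is.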

@AurelienG thanks for the response.

This database holds all the details from my monitoring tool; I'm trying to pull all the events generated there and build dashboards in Kibana.

If Logstash is not able to encode the special characters, is there any way to proceed by skipping or replacing them, or is there no point in continuing with this approach?

Thanks
Gautham
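
Regarding skipping the special characters: one possible fallback, not discussed further in this thread, is a ruby filter that forces every string field into valid UTF-8, replacing any byte that cannot be converted. A minimal sketch, assuming the rows are flat (no nested fields) and the source bytes are Windows-1252:

```
filter {
  ruby {
    code => '
      # Walk every top-level field; jdbc rows are flat, so this covers them.
      event.to_hash.each do |field, value|
        next unless value.is_a?(String)
        # Assumption: the bytes are Windows-1252. Transcode to UTF-8,
        # replacing anything invalid or unmappable with "?" so the
        # JSON serializer never sees a bad byte again.
        event.set(field, value.encode("UTF-8", "Windows-1252",
                                      invalid: :replace,
                                      undef: :replace,
                                      replace: "?"))
      end
    '
  }
}
```

This keeps the pipeline alive at the cost of mangling anything that was not really Windows-1252, so fixing the encoding at the input is the cleaner route if the collation is known.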
