Connection of Logstash to SQL Server

Good morning,
I have an issue with a SQL Server database connection to Logstash. It worked correctly before, but after changing the table used in the query it shows the following error:

[2017-09-11T12:24:05,936][ERROR][logstash.pipeline] Error registering plugin {:plugin=>"<LogStash::Inputs::Jdbc jdbc_driver_library=>\"C:\ELK\logstash-5.5.2\driver\sqljdbc_6.2\enu\mssql-jdbc-6.2.1.jre8\", jdbc_driver_class=>\"com.microsoft.sqlserver.jdbc.SQLServerDriver\", jdbc_connection_string=>\"jdbc:sqlserver://localhost:1433;databaseName=gestion_documental_dev\", jdbc_user=>\"sa\", jdbc_password=><password>, statement=>\"SELECT * FROM Fondo\", jdbc_paging_enabled=>true, jdbc_page_size=>50000, id=>\"8160dda28248e023f005898ec3c93a762767f1c7-1\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_f66905f9-2cff-4d13-b867-6730dd968f15\", enable_metric=>true, charset=>\"UTF-8\">, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>\"info\", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, last_run_metadata_path=>\"C:\Windows\system32\config\systemprofile/.logstash_jdbc_last_run\", use_column_value=>false, tracking_column_type=>\"numeric\", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>", :error=>"unacceptable character '' (0x0): special characters are not allowed in \"'reader'\", position 0 at line 0 column 0"}
[2017-09-11T12:24:06,670][ERROR][logstash.agent] Pipeline aborted due to error {:exception=>#<Psych::SyntaxError: (<unknown>): unacceptable character '' (0x0): special characters are not allowed in "'reader'", position 0 at line 0 column 0>, :backtrace=>["org/jruby/ext/psych/PsychParser.java:232:in `parse'",
"C:/ELK/logstash-5.5.2/vendor/jruby/lib/ruby/1.9/psych.rb:375:in `parse_stream'",
"C:/ELK/logstash-5.5.2/vendor/jruby/lib/ruby/1.9/psych.rb:323:in `parse'",
"C:/ELK/logstash-5.5.2/vendor/jruby/lib/ruby/1.9/psych.rb:250:in `load'",
"C:/ELK/logstash-5.5.2/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.2/lib/logstash/inputs/jdbc.rb:222:in `register'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/pipeline.rb:281:in `register_plugin'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/pipeline.rb:292:in `register_plugins'",
"org/jruby/RubyArray.java:1613:in `each'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/pipeline.rb:292:in `register_plugins'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/pipeline.rb:442:in `start_inputs'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/pipeline.rb:336:in `start_workers'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/pipeline.rb:226:in `run'",
"C:/ELK/logstash-5.5.2/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}

It looks like there's garbage data in the .logstash_jdbc_last_run file. If you've changed the source table and the data in the file isn't usable any more, you can just delete the file.
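
On Windows that would be something like the following (the path comes from the last_run_metadata_path setting in your error; double-check it before deleting):

    del "C:\Windows\system32\config\systemprofile\.logstash_jdbc_last_run"

Since record_last_run is true, the jdbc input will write a fresh file on the next run.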

Thank you very much, the problem is solved.

I have one more question: how can I verify that all of my database is in Elasticsearch?

You could e.g. use the Elasticsearch count API to count the number of docs in the index. Hopefully it matches the number of rows in your database.
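
For example, with Elasticsearch on the default port (my-index is a placeholder for whatever index your jdbc pipeline writes to):

    curl -XGET "http://localhost:9200/my-index/_count?pretty"

The count field in the response can then be compared against SELECT COUNT(*) FROM Fondo on the SQL Server side.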
