Logstash pipeline error

Hi there,
I am facing a problem each time I start Logstash; below are the errors. Can you help?

Sending Logstash logs to C:/ELK_STACK/logstash-6.4.0/logs which is now configured via log4j2.properties
[2018-10-02T09:12:34,755][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-10-02T09:12:35,505][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-10-02T09:12:38,003][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-10-02T09:12:38,395][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2018-10-02T09:12:38,411][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2018-10-02T09:12:38,603][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2018-10-02T09:12:38,658][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-02T09:12:38,664][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-10-02T09:12:38,700][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2018-10-02T09:12:38,720][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-02T09:12:38,769][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-02T09:12:39,253][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::Jdbc jdbc_user=>"postgres", schedule=>"*/1 * * * *", add_field=>{"origin"=>"sms_1"}, jdbc_validate_connection=>true, jdbc_password=>, statement=>"select * from a2p where date like '2018-08-01 01%'; ", jdbc_driver_library=>"C:\\ELK_STACK\\postgresql-42.2.5.jre6.jar", jdbc_connection_string=>"jdbc:postgresql://localhost:5433/smsc", id=>"da27f63e923d60e18a6dd65f6cdbe33e38c64bb2563e82bd37391c5bf140cdbc", jdbc_driver_class=>"org.postgresql.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_89389bf1-15d8-4efb-9694-64881409eea3", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, last_run_metadata_path=>"C:\\Users\\IOT & M2M/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>", :error=>"(): 'reader' unacceptable code point '\u0000' (0x0) special characters are not allowed\nin "'reader'", position 0 at line 0 column 0", :thread=>"#<Thread:0x34b014ef run>"}
[2018-10-02T09:12:39,704][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Psych::SyntaxError: (): 'reader' unacceptable code point '\u0000' (0x0) special characters are not allowed
in "'reader'", position 0 at line 0 column 0>, :backtrace=>["org/jruby/ext/psych/PsychParser.java:231:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:377:in `parse_stream'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:325:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:252:in `load'", "C:/ELK_STACK/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.11/lib/logstash/plugin_mixins/value_tracking.rb:106:in `read'", "C:/ELK_STACK/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.11/lib/logstash/plugin_mixins/value_tracking.rb:80:in `get_initial'", "C:/ELK_STACK/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.11/lib/logstash/plugin_mixins/value_tracking.rb:36:in `initialize'", "C:/ELK_STACK/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.11/lib/logstash/plugin_mixins/value_tracking.rb:29:in `build_last_value_tracker'", "C:/ELK_STACK/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.11/lib/logstash/inputs/jdbc.rb:216:in `register'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:241:in `register_plugin'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:252:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:252:in `register_plugins'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:395:in `start_inputs'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:293:in `start_workers'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:199:in `run'", "C:/ELK_STACK/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:159:in `block in start'"], :thread=>"#<Thread:0x34b014ef run>"}
[2018-10-02T09:12:39,731][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2018-10-02T09:12:40,075][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

It looks like something is wrong with either the query or the result of the query.

The query is working fine; I just ran it directly against the database and it worked.
Do you have any other suggestions?

Not really, I have never used jdbc.

This link might help; it mentions the same error.
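
If it is the same problem, the error itself points at the last-run tracking file rather than the query: the backtrace fails in value_tracking.rb while YAML-parsing the last_run_metadata_path file (C:\Users\IOT & M2M/.logstash_jdbc_last_run in your log), and an unacceptable 0x0 code point at position 0 usually means that file has been corrupted with NUL bytes. A minimal sketch of a workaround, assuming that is the cause:

input {
  jdbc {
    # ... your existing jdbc_* settings from the error log ...
    # Discard any previously saved last-run state, so a corrupted
    # metadata file no longer aborts the pipeline at startup.
    clean_run => true
    # Or point the tracker at a fresh file (hypothetical path):
    # last_run_metadata_path => "C:/ELK_STACK/.logstash_jdbc_last_run"
  }
}

Deleting the .logstash_jdbc_last_run file while Logstash is stopped should have the same effect.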

It probably makes it easier for someone to help if you show your configuration.

Hi Christian,

Attached is the configuration file.
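
In case the attachment does not come through, here is the configuration inline. The input settings match what the error log above prints (jdbc_password is redacted there as well), and the output matches the elasticsearch lines in the startup log:

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5433/smsc"
    jdbc_user => "postgres"
    jdbc_password => "********"
    jdbc_driver_library => "C:\ELK_STACK\postgresql-42.2.5.jre6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_validate_connection => true
    schedule => "*/1 * * * *"
    statement => "select * from a2p where date like '2018-08-01 01%';"
    add_field => { "origin" => "sms_1" }
  }
}

output {
  elasticsearch {
    # hosts as shown in the startup log
    hosts => ["127.0.0.1:9200"]
  }
}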

Can anyone help?
