Pipeline error - special characters are not allowed in reader

Hello

Logstash 7.11.1 is starting with errors and not creating indices. It logs "pipeline error - special characters are not allowed in reader", but I don't know which file is the problem. It has worked since 2021, but now it stops with this error. Could somebody help me?

[2024-06-05T00:01:41,270][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.11.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10 on 11.0.8+10 +indy +jit [mswin32-x86_64]"}
[2024-06-05T00:01:50,724][INFO ][org.reflections.Reflections] Reflections took 63 ms to scan 1 urls, producing 23 keys and 47 values 
[2024-06-05T00:01:54,023][INFO ][logstash.outputs.elasticsearch][Mysql] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2024-06-05T00:01:54,289][WARN ][logstash.outputs.elasticsearch][Mysql] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2024-06-05T00:01:54,367][INFO ][logstash.outputs.elasticsearch][Mysql] ES Output version determined {:es_version=>7}
[2024-06-05T00:01:54,382][WARN ][logstash.outputs.elasticsearch][Mysql] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2024-06-05T00:01:54,461][INFO ][logstash.outputs.elasticsearch][Mysql] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1"]}
[2024-06-05T00:01:54,710][INFO ][logstash.outputs.elasticsearch][Mysql] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2024-06-05T00:01:54,833][INFO ][logstash.outputs.elasticsearch][Mysql] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2024-06-05T00:01:54,951][INFO ][logstash.javapipeline    ][Mysql] Starting pipeline {:pipeline_id=>"Mysql", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/100_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/110_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/120_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/130_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/140_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/150_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/200_prefilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/300_filter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/310_filter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/800_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/801_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/802_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/802_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/803_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/804_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/890_dbColumnMappingFilter_Common.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/899_common_postfilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/900_elasticsearch_output.config"], :thread=>"#<Thread:0x2af62266 run>"}
[2024-06-05T00:01:58,104][INFO ][logstash.javapipeline    ][Mysql] Pipeline Java execution initialization time {"seconds"=>3.15}
[2024-06-05T00:01:58,699][ERROR][logstash.javapipeline    ][Mysql] Pipeline error {:pipeline_id=>"Mysql", :exception=>#<Psych::SyntaxError: (<unknown>): 'reader' unacceptable code point ' ' (0x0) special characters are not allowed
in "'reader'", position 0 at line 0 column 0>, :backtrace=>["org/jruby/ext/psych/PsychParser.java:250:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:454:in `parse_stream'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:388:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:277:in `load'", "C:/ProgramData/Elastic/Logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:118:in `read'", "C:/ProgramData/Elastic/Logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:51:in `common_set_initial'", "C:/ProgramData/Elastic/Logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:90:in `set_initial'", "C:/ProgramData/Elastic/Logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:34:in `initialize'", "C:/ProgramData/Elastic/Logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/value_tracking.rb:22:in `build_last_value_tracker'", "C:/ProgramData/Elastic/Logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/inputs/jdbc.rb:242:in `register'", "C:/ProgramData/Elastic/Logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "C:/ProgramData/Elastic/Logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in `register_plugins'", "C:/ProgramData/Elastic/Logstash/logstash-core/lib/logstash/java_pipeline.rb:386:in `start_inputs'", "C:/ProgramData/Elastic/Logstash/logstash-core/lib/logstash/java_pipeline.rb:311:in `start_workers'", "C:/ProgramData/Elastic/Logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in `run'", "C:/ProgramData/Elastic/Logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in 
`block in start'"], "pipeline.sources"=>["C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/100_cols_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/110_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/120_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/130_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/140_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/150_input.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/200_common_prefilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/300_filter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/310_filter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/800_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/801_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/802_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/802_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/803_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/804_dbColumnMappingFilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/890_dbColumnMappingFilter_Commonmmon.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/899_common_postfilter.config", "C:/ProgramData/Elastic/Logstash/config/pipelines/Mysql/900_elasticsearch_output.config"], :thread=>"#<Thread:0x2af62266 run>"}
[2024-06-05T00:01:58,714][INFO ][logstash.javapipeline    ][Mysql] Pipeline terminated {"pipeline.id"=>"Mysql"}
[2024-06-05T00:01:58,746][ERROR][logstash.agent           ] Failed to execute action {:id=>:Mysql, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<Mysql>, action_result: false", :backtrace=>nil}
[2024-06-05T00:01:59,277][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2024-06-05T00:02:04,109][INFO ][logstash.runner          ] Logstash shut down.
[2024-06-05T00:02:04,156][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
	at C_3a_.ProgramData.Elastic.Logstash.lib.bootstrap.environment.<main>(C:\ProgramData\Elastic\Logstash\lib\bootstrap\environment.rb:89) ~[?:?]

pipelines.yml

- pipeline.id: Mysql
  path.config: "config/pipelines/Mysql/*.config"

logstash.conf:

input  {
  beats  {
    port => 5044
  }
}

output  {
  elasticsearch  {
    hosts  =>  ["http://localhost:9200"]
    index  =>  "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

900_elasticsearch_output.config:

output  {
  elasticsearch  {
    index  =>  "index-%{+YYYY_ww}"
  }
}

It's no longer creating the indices defined in the 900_elasticsearch_output.config file.

Your Logstash configuration includes a jdbc input, and that input is failing to read its saved state from the file referenced by last_run_metadata_path. The Psych::SyntaxError in the backtrace ("'reader' unacceptable code point ' ' (0x0) special characters are not allowed") means the state file contains NUL bytes, which is not valid YAML. That typically happens when the file gets corrupted, for example by an unclean shutdown or a full disk.

last_run_metadata_path is an option of the jdbc input; by default it points to a .logstash_jdbc_last_run file in the home directory of the user running Logstash. You haven't shown a jdbc input in the configuration you posted, but the backtrace (logstash-integration-jdbc .../inputs/jdbc.rb, value_tracking.rb) shows you clearly have one, presumably in one of the *_input.config files. Find the tracking file that input uses, delete it (or restore a valid value in it), and restart Logstash; the input will then re-read from its configured starting point.
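If re-importing from scratch is acceptable, one way to recover is to pin the tracking file to a known location and reset it on the next start. The sketch below is illustrative only: the connection string, credentials, table, tracking column, and file path are placeholders, not taken from the original post.

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"   # placeholder
    jdbc_user => "user"                                            # placeholder
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT * FROM my_table WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
    # Pin the state file so a corrupted copy is easy to find and remove:
    last_run_metadata_path => "C:/ProgramData/Elastic/Logstash/data/jdbc_last_run_mysql"
    # One-off reset: ignore any previously saved (corrupted) state.
    clean_run => true
  }
}
```

Remove clean_run => true again after one successful run, otherwise every restart will re-import all rows from the beginning.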