Logstash 5.1.2 starting with errors

I'm new to Logstash and Elasticsearch.
I want to index a MySQL database into Elasticsearch using Logstash. I downloaded the latest version, 5.1.2 (of both Logstash and Elasticsearch).
Elasticsearch starts without problems.
For Logstash, I downloaded the MySQL connector JAR and put it in the bin folder. I also created a file named "logstash.conf" in the bin folder:
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.38.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/ecommerce"
    jdbc_user => "root"
    jdbc_password => "root"
    schedule => "* * * * *"
    statement => "select * from products"
  }
}

output {
  #stdout { codec => json_lines }
  elasticsearch {
    index => "products"
    document_type => "product"
    document_id => "%{ProductID}"
    hosts => ["localhost:9200"]
  }
}
When I start Logstash with logstash -f logstash.conf, I get these errors and my database is not indexed :disappointed_relieved:

Could not find log4j2 configuration at path /MYENV/outils/elastic512/logstash512/config/log4j2.properties. Using default config which logs to console
15:40:42.816 [[main]-pipeline-manager] ERROR logstash.agent - Pipeline aborted due to error {:exception=>#<Psych::SyntaxError: (): 'reader' unacceptable character ' ' (0x0) special characters are not allowed in "'reader'", position 0 at line 0 column 0>, :backtrace=>["org/jruby/ext/psych/PsychParser.java:232:in `parse'", "C:/MYENV/outils/elastic512/logstash512/vendor/jruby/lib/ruby/1.9/psych.rb:375:in `parse_stream'", "C:/MYENV/outils/elastic512/logstash512/vendor/jruby/lib/ruby/1.9/psych.rb:323:in `parse'", "C:/MYENV/outils/elastic512/logstash512/vendor/jruby/lib/ruby/1.9/psych.rb:250:in `load'", "[…]nput-jdbc-4.1.3/lib/logstash/inputs/jdbc.rb:206:in `register'", "C:/MYENV/outils/elastic512/logstash512/logstash-core/lib/logstash/pipeline.rb:353:in `start_inputs'", "org/jruby/RubyArray.java:1613:in `each'", "C:/MYENV/outils/elastic512/logstash512/logstash-core/lib/logstash/pipeline.rb:352:in `start_inputs'", "C:/E[…]in `start_workers'", "C:/MYENV/outils/elastic512/logstash512/logstash-core/lib/logstash/pipeline.rb:183:in `run'", "C:/MYENV/outils/elastic512/logstash512/logstash-core/lib/logstash/agent.rb:292:in `start_pipeline'"]}
15:40:42.915 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
15:40:45.816 [LogStash::Runner] WARN logstash.agent - stopping pipeline {:id=>"

Do you have any idea about this problem?

I have jdk1.8.0_91 on my machine.


This error is driving me crazy.
I tested with version 2.2.0 and I had the same error, but with fewer messages:

C:\MYENV\esproject\logstash-2.2.0\bin>logstash -f logstash.mysql.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 8
The error reported is:
(): 'reader' unacceptable character ' ' (0x0) special characters are not allowed in "'reader'", position 0 at line 0 column 0

Is there a problem in my config file?

It looks like $USER_HOME/.logstash_jdbc_last_run exists but isn't valid YAML. Fix what's broken about the file, or perhaps just delete it?
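A quick way to confirm (and clear) a broken state file, sketched here against a throwaway directory so it can be run safely; on the real machine the path would be $HOME/.logstash_jdbc_last_run (C:\Users\<user>\.logstash_jdbc_last_run on Windows):

```shell
# Demonstration on a temporary copy -- nothing real is touched.
demo_home="$(mktemp -d)"
state_file="$demo_home/.logstash_jdbc_last_run"

# Simulate the corruption: a NUL byte is exactly the "(0x0)" character
# that Psych rejects in the error above.
printf '\0\0\0' > "$state_file"

# od -c shows the raw bytes; \0 entries confirm the file is not valid YAML.
od -c "$state_file"

# Deleting the file is safe: the jdbc input recreates it on its next
# scheduled run, and sql_last_value simply resets (the query re-reads
# all rows once).
rm -f "$state_file"
```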


Thanks Magnus for your reply.
Where can I find .logstash_jdbc_last_run?

It should be in your home directory. I don't know why the documentation mentions $USER_HOME. It should say $HOME.

Thanks for the reply magnusbaeck.
What should I modify in $HOME?
I went back to version 2.4.1 and my problems are solved. I think it is more stable than 5.1.2.

What should I modify in $HOME?

As I said, fix what's wrong in .logstash_jdbc_last_run (the error message suggests broken YAML) or delete the file.

Thanks again for the reply magnusbaeck.
I searched for the logstash_jdbc_last_run file, found it in C:/Users/user, and deleted it. It's OK now :sob:
I have 5 records in my products table, but when I go to http://localhost:9200/products/product/_search
I get just the last record as the result:

Any idea, please?

That's because the document id is "%{ProductID}". The event has no ProductID field, but it does have a field named productid.
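For reference, a sketch of the two possible fixes (assuming the column really is named ProductID in MySQL): the jdbc input lowercases column names by default, so either reference the lowercased field in the output, or turn that behavior off in the input with the plugin's lowercase_column_names option:

```
# Option 1: reference the lowercased field
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "products"
    document_type => "product"
    document_id => "%{productid}"
  }
}

# Option 2: keep the original column casing
input {
  jdbc {
    # ... same jdbc_* settings as in the original config ...
    lowercase_column_names => false
  }
}
```

With either option, each row gets its own document ID, so all 5 products are indexed instead of overwriting one another.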

Thanks magnusbaeck.
It's OK now.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.