Logstash 5.5 unable to locate config file on Windows 10

Hi,
I am using logstash-5.5.0 on Windows 10.
It is unable to find the config file, which I have kept in the Logstash folder itself.
Below is the error I get when I run "logstash -f logstash.conf":

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to C:/logstash-5.5.0/logs which is now configured via log4j2.properties
[2018-01-23T16:54:13,613][INFO ][logstash.agent ] No config files found in path {:path=>"C:/logstash-5.5.0/bin/logstash.conf"}
[2018-01-23T16:54:13,627][ERROR][logstash.agent ] failed to fetch pipeline configuration {:message=>"No config files found: logstash.conf. Can you make sure this path is a logstash config file?"}
2018-01-23 16:54:13,721 Api Webserver ERROR No log4j2 configuration file found. Using default configuration: logging only errors to the console.

It seems you aren't invoking Logstash from the /logstash directory; you're in the /logstash/bin directory, so it's expected not to find a config file that lives under /logstash.

Either pass the absolute (full) path to your Logstash config, or move the file into the /bin folder (although the latter is not really a nice solution).
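For example, the log output above shows the install lives in C:\logstash-5.5.0, so (assuming the config file sits directly in that folder, as described) you could run it from any directory like this:

```shell
:: Invoke Logstash with the full path to the config file,
:: so the current working directory no longer matters
C:\logstash-5.5.0\bin\logstash.bat -f C:\logstash-5.5.0\logstash.conf
```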

Yes.

@manisha_2018 You should double-check the path of the config file. BTW, you can add your Logstash bin folder to the PATH environment variable so that you can use the logstash command from anywhere, and append -f <Full path of your logstash.conf>.

# With the logstash bin folder added to your PATH, you can run it from the install directory like this:
eason:logstash eason$ pwd
/Users/eason/development/logstash
eason:logstash eason$ ls
bin		logstash.conf
eason:logstash eason$ logstash -f logstash.conf
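That example is from a macOS shell; on Windows 10 the equivalent (a sketch, assuming the C:\logstash-5.5.0 install path shown in the logs) would be:

```shell
:: Append the Logstash bin folder to PATH for the current cmd session
set PATH=%PATH%;C:\logstash-5.5.0\bin

:: Then run logstash from any directory, giving the full config path
logstash.bat -f C:\logstash-5.5.0\logstash.conf
```

Note that `set` only affects the current session; to make the change permanent, add the folder to PATH via the System Properties dialog instead.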

I was able to get through this. I had pasted my config file into the config folder, but its extension was still .txt, which I found out through Folder Options. After changing that, Logstash is now able to find the config file.
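By default Windows Explorer hides known extensions, so a file named logstash.conf.txt is displayed as logstash.conf. A quick way to see the real file names from a command prompt:

```shell
:: dir /b prints the bare file names, including any hidden extensions
dir /b C:\logstash-5.5.0\config
```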
Now here is the new problem: I am trying to connect to a SQL DB through Logstash and show the output in Elasticsearch, and for that my config file is:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
    # The user we wish to execute our statement as
    jdbc_user => ""
    jdbc_password => "***"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "C:\logstash-5.5.0\mysql-connector-java-5.1.45"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT * FROM example"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "localhost:9200"
    index => "test-migrate"
    document_type => "data"
  }
}

And I am getting this output, which I am unable to understand:
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to C:/logstash-5.5.0/logs which is now configured via log4j2.properties
[2018-02-01T15:10:29,382][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-02-01T15:10:29,395][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-02-01T15:10:29,851][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#Java::JavaNet::URI:0x3ba0ab33}
[2018-02-01T15:10:29,854][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-02-01T15:10:29,998][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, 
"latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", "doc_values"=>true}}}}}}}}
[2018-02-01T15:10:30,061][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2018-02-01T15:10:30,892][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#Java::JavaNet::URI:0x1f4c39ef]}
[2018-02-01T15:10:30,905][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2018-02-01T15:10:33,577][INFO ][logstash.pipeline ] Pipeline main started
LoadError: no such file to load -- C:/logstash-5.5.0/mysql-connector-java-5.1.45
require at org/jruby/RubyKernel.java:1040
require at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/polyglot-0.3.5/lib/polyglot.rb:65
load_drivers at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.1/lib/logstash/plugin_mixins/jdbc.rb:134
each at org/jruby/RubyArray.java:1613
load_drivers at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.1/lib/logstash/plugin_mixins/jdbc.rb:132
open_jdbc_connection at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.1/lib/logstash/plugin_mixins/jdbc.rb:146
execute_statement at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.1/lib/logstash/plugin_mixins/jdbc.rb:217
execute_query at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.1/lib/logstash/inputs/jdbc.rb:272
run at C:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-4.2.1/lib/logstash/inputs/jdbc.rb:256
inputworker at C:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:456
start_input at C:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:449
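The LoadError at the top of that stack trace is the key line: JRuby could not load C:/logstash-5.5.0/mysql-connector-java-5.1.45, which suggests jdbc_driver_library is pointing at the downloaded folder rather than at the connector .jar file itself. A hedged sketch of the fix (the exact jar filename inside that folder is an assumption; verify the actual name on disk):

```
# jdbc_driver_library must be the full path to the .jar file,
# not the directory that contains it
# (filename below is hypothetical; check what the download actually extracted)
jdbc_driver_library => "C:\logstash-5.5.0\mysql-connector-java-5.1.45\mysql-connector-java-5.1.45-bin.jar"
```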

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.