Logstash gives: Unknown setting '"bucket"' for s3 error

Referring to: Security for Elasticsearch is now free | Elastic Blog


  • My logstash.conf has:

input {
  s3 {
    "access_key_id" => ""
    "secret_access_key" => ""
    "bucket" => "bucket_name"
    "region" => "region"
    "prefix" => "my_prefix/"
    "type" => "type1"
  }
}

filter {
  if [type] == "type1" {
    grok {
      match => {
        "message" => "[%{TIMESTAMP_ISO8601:time}.*] [%{NUMBER:pid}] [%{WORD:severity}] %{GREEDYDATA:log}"
      }
    }
  }
}

output {
  if [type] == "type1" {
    elasticsearch {
      hosts => ["my_ip:9200"]
      manage_template => false
      index => "type1-%{+YYYY.MM.dd}"
    }
  }
}

  • My pipelines.yml has:

- pipeline.id: logstash
  path.config: "/home/ubuntu/logstash-7.1.1/config/logstash.conf"

I am facing the following error when starting Logstash (full output below).

./bin/logstash
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/home/ubuntu/logstash-7.1.1/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.FileDescriptor.fd
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

Sending Logstash logs to /home/ubuntu/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-06-26T11:19:00,332][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-26T11:19:26,673][ERROR][logstash.inputs.s3 ] Unknown setting '"bucket"' for s3
[2019-06-26T11:19:26,675][ERROR][logstash.inputs.s3 ] Unknown setting '"additional_settings"' for s3
[2019-06-26T11:19:26,675][ERROR][logstash.inputs.s3 ] Unknown setting '"access_key_id"' for s3
[2019-06-26T11:19:26,676][ERROR][logstash.inputs.s3 ] Unknown setting '"type"' for s3
[2019-06-26T11:19:26,677][ERROR][logstash.inputs.s3 ] Unknown setting '"prefix"' for s3
[2019-06-26T11:19:26,677][ERROR][logstash.inputs.s3 ] Unknown setting '"region"' for s3
[2019-06-26T11:19:26,678][ERROR][logstash.inputs.s3 ] Unknown setting '"secret_access_key"' for s3
[2019-06-26T11:19:26,700][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:logstash, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["/home/ubuntu/logstash-7.1.1/logstash-core/lib/logstash/config/mixin.rb:86:in config_init'", "/home/ubuntu/logstash-7.1.1/logstash-core/lib/logstash/inputs/base.rb:60:in initialize'", "org/logstash/plugins/PluginFactoryExt.java:255:in plugin'", "org/logstash/plugins/PluginFactoryExt.java:117:in buildInput'", "org/logstash/execution/JavaBasePipelineExt.java:50:in initialize'", "/home/ubuntu/logstash-7.1.1/logstash-core/lib/logstash/java_pipeline.rb:23:in initialize'", "/home/ubuntu/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/home/ubuntu/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:325:in block in converge_state'"]}
[2019-06-26T11:19:27,303][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-06-26T11:19:32,039][INFO ][logstash.runner ] Logstash shut down.

Remove the quotes around the names of the options. It should be:

bucket => "bucket_name"
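
For reference, the whole input block with the quotes removed from the option names would look something like this (the bucket, region, and prefix values are just the placeholders from your original config):

input {
  s3 {
    access_key_id => ""
    secret_access_key => ""
    bucket => "bucket_name"
    region => "region"
    prefix => "my_prefix/"
    type => "type1"
  }
}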

Hey Badger, I removed the quotes, and now I am facing this error when I start Logstash.

Starting Logstash {"logstash.version"=>"7.1.1"}

[2019-06-26T14:06:40,162][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_system:xxxxxx@127.0.0.1:9200/]}}
[2019-06-26T14:06:40,779][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://logstash_system:xxxxxx@127.0.0.1:9200/"}
[2019-06-26T14:06:41,086][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-26T14:06:41,099][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-06-26T14:06:41,188][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["http://127.0.0.1:9200"]}
warning: thread "Ruby-0-Thread-5: :1" terminated with exception (report_on_exception is true):
LogStash::Outputs::Elasticsearch::HttpClient::Pool::BadResponseCodeError: Got response code '403' contacting Elasticsearch at URL 'http://127.0.0.1:9200/logstash'
perform_request at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80
perform_request_to_url at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291
perform_request at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278
with_connection at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373
perform_request at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277
Pool at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285
exists? at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:341
rollover_alias_exists? at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:359
maybe_create_rollover_alias at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:91
setup_ilm at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:10
setup_after_successful_connection at /home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:52
[2019-06-26T14:06:41,645][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Outputs::Elasticsearch::HttpClient::Pool::BadResponseCodeError: LogStash::Outputs::Elasticsearch::HttpClient::Pool::BadResponseCodeError>, :backtrace=>["/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in `perform_request_to_url'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in `block in Pool'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:341:in `exists?'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:359:in `rollover_alias_exists?'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:91:in `maybe_create_rollover_alias'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:10:in `setup_ilm'", "/home/ubuntu/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:52:in `block in setup_after_successful_connection'"]}
[2019-06-26T14:06:41,897][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"logstash", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x524ca88e run>"}
[2019-06-26T14:06:41,898][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

403 is 'Forbidden'. Sounds like you need to supply a username and password to the elasticsearch output.
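
For example, credentials can be passed to the elasticsearch output via its user and password options. A sketch based on your existing output block is below; logstash_writer and changeme are placeholders, not values from your setup. Note that your log shows the connection going out as logstash_system, and that built-in user is meant for storing monitoring data, not for indexing events, so you would normally use (or create) a user that has write privileges on your indices.

output {
  if [type] == "type1" {
    elasticsearch {
      hosts => ["my_ip:9200"]
      user => "logstash_writer"   # placeholder: a user with write access to the target indices
      password => "changeme"      # placeholder password
      manage_template => false
      index => "type1-%{+YYYY.MM.dd}"
    }
  }
}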

