AWS S3 bucket logs to Elasticsearch Service (AWS) through Logstash

  • I set up Logstash on an EC2 Linux instance (AWS) and the Elasticsearch Service on AWS

  • I have created one separate config file (with input and output sections) under the /etc/logstash/conf.d path

  • but I am not getting any logs in Kibana

  • the connection between the two nodes has been established successfully
    Any suggestions on where to check what I am missing?

It would help if you showed us your config.
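In the meantime, one quick check is to ask Logstash to validate the config syntax without starting the pipeline (a sketch assuming a package install, using the config path you mentioned):

/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/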

My config file:
input {
  s3 {
    #"access_key_id" => "your_access_key_id"
    #"secret_access_key" => "your_secret_access_key"
    "region" => "us-east"
    "bucket" => "my bucket_name"
    #"prefix" => "Logs"
    #"interval" => "10"
    #"additional_settings" => {
    #"force_path_style" => true
    #"follow_redirects" => false
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://xxxxxxx.es.amazonaws.com:443"]
    index => "logs-test"
    #user => "elastic"
    #password => "changeme"
  }
}

I am getting this error in /var/log/logstash/logstash-plain.log:

[2019-09-18T01:39:32,179][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 14, column 1 (byte 368) after ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:24:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state

This error is because you have a closing curly brace, left over from the additional_settings block, at the end of the input section that you have not commented out. I have never used AWS ES, but I recall seeing that some users needed to use the amazon_es output plugin as well.
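As a sketch, the input section with that leftover brace commented out would look like this (note also that "us-east" is not a valid AWS region name; you likely want "us-east-1"):

input {
  s3 {
    region => "us-east-1"
    bucket => "my_bucket_name"    # placeholder; real bucket names cannot contain spaces
    #"additional_settings" => {
    #  "force_path_style" => true
    #  "follow_redirects" => false
    #}
  }
}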

I have tweaked my config file a bit, and it now looks like this:

input {
  s3 {
    #"access_key_id" => "your_access_key_id"
    #"secret_access_key" => "your_secret_access_key"
    "region" => "us-east"
    "bucket" => "xxxxxxxx-log"
    #"prefix" => "Logs"
    #"interval" => "10"
  }
}

output {
  elasticsearch {
    hosts => ["https:xxxxxxx.amazonaws.com:443"]
    index => "s3logs-*"
    #user => "elastic"
    #password => "changeme"
  }
}

This time I am not getting any errors, except the below:
[2019-09-18T19:49:45,476][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.2"}
[2019-09-18T19:50:25,895][INFO ][org.reflections.Reflections] Reflections took 1881 ms to scan 1 urls, producing 19 keys and 39 values
[2019-09-18T20:35:07,880][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.2"}
[2019-09-18T20:35:09,864][INFO ][org.reflections.Reflections] Reflections took 105 ms to scan 1 urls, producing 19 keys and 39 values
[2019-09-18T20:50:13,114][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.2"}
[2019-09-18T20:50:16,230][INFO ][org.reflections.Reflections] Reflections took 109 ms to scan 1 urls, producing 19 keys and 39 values

But I still have no idea what I am missing here!
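One way to check whether the S3 input is reading anything at all is to temporarily add a stdout output alongside the elasticsearch one, so every event is printed to the console (a debugging sketch, not part of the config above):

output {
  # print each event so you can see whether the input produces anything
  stdout { codec => rubydebug }
}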

I am directly using the S3 plugin in Logstash.
Do I need to install or enable the S3 input and output plugins before using them?
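For reference, installed plugins can be listed with the bundled logstash-plugin tool; the s3 input and output plugins ship with Logstash by default, while the amazon_es output must be installed separately (paths assume a package install):

/usr/share/logstash/bin/logstash-plugin list | grep s3
/usr/share/logstash/bin/logstash-plugin install logstash-output-amazon_es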

Finally I ended up with a positive result; I have now successfully shipped S3 bucket logs through Logstash to the AWS ES service.

My input and output config files are as below:
Input config file:

input {
  s3 {
    bucket => ""
    region => "us-east-1"
    prefix => "<your prefix (do not include the bucket name in the prefix)>"
  }
}

Output config file:
output {
  amazon_es {
    hosts => ["vpcxxxxxxx.es.amazonaws.com"]
    index => "test"
    aws_access_key_id => ''
    aws_secret_access_key => ''
  }
}
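To confirm that documents actually arrived, the target index can be queried through Elasticsearch's _cat API (same redacted host as above):

curl "https://vpcxxxxxxx.es.amazonaws.com/_cat/indices/test?v"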
