Code:
input {
  cloudwatch {
    access_key_id => "...id..."
    secret_access_key => "...key..."
    namespace => "AWS/Logs"
    metrics => ["IncomingBytes", "ForwardedBytes", "IncomingLogEvents", "ForwardedLogEvents"]
    filters => { "tag:message" => "[Container]" }
    region => "us-west-2"
  }
}
output {
  elasticsearch {
    hosts => ["https://***.us-***.aws.found.io:9243"]
    user => "..."
    password => "..."
    index => "cloudwatch"
  }
  stdout {
    codec => rubydebug
  }
}
Output:
./bin/logstash -f config/pipelines/cloudwatchPipeline.conf
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/home/kourosh/Documents/logstash-7.0.1/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.FileDescriptor.fd
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /home/kourosh/Documents/logstash-7.0.1/logs which is now configured via log4j2.properties
[2019-05-28T11:53:11,903][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-28T11:53:11,910][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2019-05-28T11:53:14,230][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@2f9166ddd14b4061847f4da62b35fe31.us-west-2.aws.found.io:9243/]}}
[2019-05-28T11:53:15,178][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@2f9166ddd14b4061847f4da62b35fe31.us-west-2.aws.found.io:9243/"}
[2019-05-28T11:53:15,384][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-05-28T11:53:15,385][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-05-28T11:53:15,402][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["https://2f9166ddd14b4061847f4da62b35fe31.us-west-2.aws.found.io:9243"]}
[2019-05-28T11:53:15,413][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-05-28T11:53:15,425][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x28a2ea0f run>"}
[2019-05-28T11:53:15,483][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-05-28T11:53:15,516][INFO ][logstash.inputs.cloudwatch] Polling CloudWatch API
[2019-05-28T11:53:15,552][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-28T11:53:15,554][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-05-28T11:53:15,786][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-05-28T11:53:18,253][INFO ][logstash.inputs.cloudwatch] [Aws::CloudWatch::Client 200 1.430564 0 retries] list_metrics(namespace:"AWS/Logs")
[2019-05-28T11:53:18,435][INFO ][logstash.inputs.cloudwatch] [Aws::CloudWatch::Client 200 0.089253 0 retries] get_metric_statistics(namespace:"AWS/Logs",metric_name:"IncomingBytes",start_time:2019-05-28 18:38:18 UTC,end_time:2019-05-28 18:53:18 UTC,period:300,statistics:["SampleCount","Average","Minimum","Maximum","Sum"],dimensions:[{name:"tag:message",value:"[FILTERED]"}])
[2019-05-28T11:53:18,485][INFO ][logstash.inputs.cloudwatch] [Aws::CloudWatch::Client 200 0.043631 0 retries] get_metric_statistics(namespace:"AWS/Logs",metric_name:"IncomingLogEvents",start_time:2019-05-28 18:38:18 UTC,end_time:2019-05-28 18:53:18 UTC,period:300,statistics:["SampleCount","Average","Minimum","Maximum","Sum"],dimensions:[{name:"tag:message",value:"[FILTERED]"}])
Question:
I don't receive any events, either in the terminal or in Elasticsearch Cloud. What is the problem? Personally, I suspect the `filters` option is wrong, but I don't know what to change it to.
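Update: the log output shows the `filters` entry is passed to `get_metric_statistics` as a CloudWatch dimension (`dimensions:[{name:"tag:message",value:"[FILTERED]"}]`). The AWS/Logs metrics (IncomingBytes, IncomingLogEvents, etc.) are published with a `LogGroupName` dimension, not a `tag:message` dimension, so a filter keyed on the actual dimension may be what is needed. A minimal sketch, where the log group name is a hypothetical placeholder:

```
filters => { "LogGroupName" => "my-log-group" }  # hypothetical log group name
```

With no metric carrying the requested dimension, `get_metric_statistics` returns an empty datapoint set, which would explain the 200 responses with no events emitted.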