[Resolved] No AWS ELB logs in Elasticsearch

Stack Version: 6.6

I am not able to see logs in Elasticsearch/Kibana, even though they are successfully parsed in Logstash.

In the logs, I can see that Logstash is able to fetch logs from the ELB. It also prints "output received", but I don't see the events in Elasticsearch. First, I added a template for Filebeat and I am able to get data from syslogs. After adding the ELB logs, I didn't see them in ES, so I created another index named elb-6.6.1 using the same index template as Filebeat (thinking that maybe it needs a new index), still with no luck. It would be of great help if you could tell me what I am doing wrong. Do the ELB logs need their own index, or should they be able to use the Filebeat index?
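For reference, the kind of index split I had in mind looks roughly like this (a sketch, not my exact config; the hosts and index names are placeholders, and [@metadata][type] is set in the input):

output {
  if [@metadata][type] == "elb" {
    elasticsearch {
      hosts => ["ip:port"]
      manage_template => false
      # ELB events get their own daily index
      index => "elb-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["ip:port"]
      manage_template => false
      # everything else keeps using the filebeat index
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  }
}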

Logs (actual data removed):
S3 input: Found key {:key=>"bucket/prefix/s3_file.gz"}
Using default generated file for the sincedb
S3 input: Adding to objects {:key=>"bucket/prefix/s3_file.gz"}
objects length is: {:length=>1}
...
...
S3 input: Download remote file {:remote_key=>"bucket/prefix/file.log", :local_filename=>"tmp_file.log"}
Processing file {:filename=>"tmp_file.log"}
filter received {"event"=>{"message"=>"log_string"}}
Running grok filter {:event=>#LogStash::Event:0x7c3d00e}
Event now: {:event=>#LogStash::Event:0x7c3d00e}
output received {"event"=>key,values}
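To confirm events were actually leaving the pipeline, I ran with a temporary stdout output next to the elasticsearch one (a throwaway debugging sketch, removed afterwards):

output {
  # print each event with all parsed fields so @timestamp and type are visible
  stdout { codec => rubydebug }
}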

Below are my configs:

input {
  s3 {
    bucket => "bucket"
    prefix => "prefix"
    region => "region"
    type => "elblogs"
    codec => plain
    add_field => {
      "[@metadata][type]" => "elb"
    }
  }
}
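The "Using default generated file for the sincedb" message in the logs above comes from this input; the s3 plugin also accepts an explicit location if you want the position file somewhere predictable (a sketch, the path is made up):

  s3 {
    bucket => "bucket"
    prefix => "prefix"
    region => "region"
    # hypothetical fixed location instead of the generated default
    sincedb_path => "/var/lib/logstash/sincedb_elb"
  }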

filter {
  if [type] == "elblogs" {
    grok {
      # first pattern: ELB access log line; second pattern: ELB event notification
      match => {
        "message" => [
          "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb_name} %{IP:elb_client_ip}:%{NUMBER:elb_client_port:int} (?:%{IP:elb_backend_ip}:%{NUMBER:elb_backend_port:int}|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} (?:%{NUMBER:elb_status_code:int}|-) (?:%{NUMBER:backend_status_code:int}|-) %{NUMBER:elb_received_bytes:int} %{NUMBER:elb_sent_bytes:int} \"(?:%{WORD:verb}|-) (?:%{GREEDYDATA:request}|-) (?:HTTP/%{NUMBER:httpversion}|-( )?)\" \"%{DATA:userAgent}\"( %{NOTSPACE:ssl_cipher} %{NOTSPACE:ssl_protocol})?",
          "%{GREEDYDATA:event_name} for ELB: %{NOTSPACE:elb_name} at %{TIMESTAMP_ISO8601:timestamp}"
        ]
      }
    }
    if [request] =~ /.+/ {
      grok {
        match => ["request", "(?:%{WORD:http_method}) (?:%{DATA:http_path})? (?:%{DATA:http_type}/%{NUMBER:http_version:float})?|%{GREEDYDATA:rawrequest}"]
      }
    }
    if [http_path] =~ /.+/ {
      grok {
        match => ["http_path", "(?:%{WORD:http_path_protocol}://)?(%{NOTSPACE:http_path_site}:)?(?:%{NUMBER:http_path_port:int})?(?:%{GREEDYDATA:http_path_url})?"]
      }
    }
    geoip {
      source => "elb_client_ip"
    }
  }
  date {
    match => [ "timestamp", "ISO8601" ]
  }
}

output {
  elasticsearch {
    hosts => ["ip:port"]
    sniffing => true
    manage_template => false
    index => "elb-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

This is resolved. The ELB logs were old, so they were indexed at the log date rather than at the current timestamp, which is why I didn't see them. Now I can see the data in Kibana.
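For anyone who hits the same thing: the %{+YYYY.MM.dd} in the index name is rendered from each event's @timestamp, and the date filter sets @timestamp to the parsed log time, so old ELB logs land in old daily indices and outside a short Kibana time window. If you would rather keep @timestamp at ingest time, one option (a sketch, not what I did) is to write the parsed time to a separate field:

filter {
  date {
    match => [ "timestamp", "ISO8601" ]
    # store the parsed log time in its own field; @timestamp stays at ingest time
    target => "log_timestamp"
  }
}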
