I am trying to collect CloudTrail logs from AWS with Filebeat.
Scenario 1: I use the Elasticsearch output in Filebeat; the CloudTrail index contains many enrichment fields such as geo, ...
Scenario 2: I use the Logstash output in Filebeat and from Logstash I output to Elasticsearch; there are far fewer fields.
I want to use Logstash between Filebeat and Elasticsearch (scenario 2), but the resulting data is missing those enrichment fields.
Please help me to solve this problem.
Thank you.
Hi @TuongBma.
I have the same problem. It looks like an ingest pipeline problem, but I'm not sure.
To fix it I did:
- Delete the existing Filebeat indices from Elasticsearch
- Set my `filebeat.yml` with Elasticsearch and Kibana output
- Ran `filebeat setup` to create the index template, ingest pipelines, dashboards, etc. in Elasticsearch and Kibana
- Set my `filebeat.yml` [1] with Logstash output and removed the Elasticsearch and Kibana output (my config has a `fields` setting, but that was only for debugging)
- Set my Logstash config [2] to output the logs to Elasticsearch, pointing at the ingest pipeline
I don't know if this is the best practice or solution, but it worked for me.
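For reference, the `filebeat setup` step above can be run with temporary `-E` overrides instead of editing `filebeat.yml` twice. This is only a sketch: the localhost addresses and credentials are placeholders I'm assuming, not values from the original post.

```shell
# Point Filebeat at Elasticsearch/Kibana just for setup, so it can install
# the index template, ingest pipelines, and dashboards. The Logstash output
# must be disabled, since `filebeat setup` requires an Elasticsearch output.
filebeat setup \
  -E output.logstash.enabled=false \
  -E 'output.elasticsearch.hosts=["localhost:9200"]' \
  -E output.elasticsearch.username=ELASTIC_USER \
  -E output.elasticsearch.password=ELASTIC_PASS \
  -E 'setup.kibana.host="localhost:5601"'
```

After setup succeeds, you can switch back to the Logstash output without touching the rest of the config.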
[1] filebeat.yml:

```yaml
filebeat.inputs:
- type: s3
  queue_url: SQS_HTTP_ADDRESS
  expand_event_list_from_field: Records
  enabled: true
  fields:
    tipo_log: "cloudtrail"

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml

setup.template:
  name: "%{[fields.tipo_log]}"
  pattern: "%{[fields.tipo_log]}-*"
  settings:
    index.number_of_shards: 1

output.logstash:
  enabled: true
  hosts: ["LOGSTASH_ADDRESS"]
```
[2] Logstash config:

```
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    enable_metric => false
    hosts => ["ELASTIC_ADDRESS"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #index => "filebeat-7.9.0-2020.09.10-000001"
    user => "ELASTIC_USER"
    password => "ELASTIC_PASS"
    pipeline => "filebeat-7.9.0-aws-cloudtrail-pipeline"
  }
}
```
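To confirm that `filebeat setup` actually installed the ingest pipeline that the Logstash `pipeline` option references, you can ask Elasticsearch directly. A minimal sketch, assuming Elasticsearch on localhost and the same placeholder credentials as above:

```shell
# List the CloudTrail ingest pipeline installed by `filebeat setup`.
# An empty result or 404 means setup must be re-run before Logstash
# events will be enriched (geo fields etc.) on ingest.
curl -s -u ELASTIC_USER:ELASTIC_PASS \
  "http://localhost:9200/_ingest/pipeline/filebeat-7.9.0-aws-cloudtrail-pipeline?pretty"
```

If the pipeline exists but events still lack enrichment fields, double-check that the pipeline name in the Logstash output matches your Filebeat version exactly.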