Hello Elastic Team,
I have been working with Filebeat's MySQL module for a long time, and I can see that similar logs are generated on AWS RDS as well.
So after the 7.4 Beats release (which added s3 as an input), I want to ingest those RDS logs into Elasticsearch.
I was already shipping the RDS logs to an s3 bucket using a script.
Now my question is: can I use the existing Filebeat MySQL module with s3 as the input?
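For context, here is a minimal sketch of the s3 input side of my filebeat.yml (the SQS queue URL and region are placeholders, not my real values); the 7.4 s3 input reads object-created notifications from an SQS queue:

```yaml
filebeat.inputs:
  # s3 input (new in Filebeat 7.4): polls an SQS queue that receives
  # s3 object-created event notifications, then fetches and reads the objects.
  - type: s3
    # Placeholder queue URL -- replace with the real SQS queue
    # subscribed to the bucket's event notifications.
    queue_url: https://sqs.us-east-1.amazonaws.com/123456789012/rds-logs-queue
```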
I tried sending the logs directly to the MySQL ingest pipelines using the configuration below:
output.elasticsearch:
  hosts: ['http://elasticsearch-01.some.com:9200']
  # Optional protocol and basic auth credentials.
  #protocol: "https"
  username: "elastic"
  password: "CgkIXXXXXXXXXX"
  worker: 8
  pipelines:
    - pipeline: "filebeat-7.4.0-mysql-slowlog-pipeline"
      when.contains:
        aws.s3.object.key: "mysql-slowquery.log"
    - pipeline: "filebeat-7.4.0-mysql-error-pipeline"
      when.contains:
        aws.s3.object.key: "mysql-error.log"