Way to send RDS MySQL logs to Elasticsearch

Hello Elastic Team,

I have been working with Filebeat's MySQL module for a long time, and I can see that similar logs are also generated by AWS RDS.

So, after the 7.4 Beats release (which added s3 as an input), I want to ingest those RDS logs into Elasticsearch.

I was already sending those RDS logs to an S3 bucket using a script.
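Roughly, that export step looks like the sketch below (a minimal example using boto3; the instance identifier, bucket name, and key prefix are placeholders, not my real values):

import boto3

RDS_INSTANCE = "my-rds-instance"   # placeholder DB instance identifier
BUCKET = "my-rds-log-bucket"       # placeholder S3 bucket
PREFIX = "rds-logs/"               # placeholder key prefix

rds = boto3.client("rds")
s3 = boto3.client("s3")

# List the log files the RDS instance exposes and keep only the MySQL
# error and slow query logs.
files = rds.describe_db_log_files(DBInstanceIdentifier=RDS_INSTANCE)
for log in files["DescribeDBLogFiles"]:
    name = log["LogFileName"]
    if "error" not in name and "slowquery" not in name:
        continue

    # Download the file in portions until no more data is pending.
    chunks, marker, pending = [], "0", True
    while pending:
        portion = rds.download_db_log_file_portion(
            DBInstanceIdentifier=RDS_INSTANCE,
            LogFileName=name,
            Marker=marker,
        )
        chunks.append(portion.get("LogFileData") or "")
        marker = portion["Marker"]
        pending = portion["AdditionalDataPending"]

    # Upload under a key that keeps the original file name, e.g.
    # rds-logs/error/mysql-error.log or rds-logs/slowquery/mysql-slowquery.log,
    # so the pipeline conditions below can match on aws.s3.object.key.
    s3.put_object(Bucket=BUCKET, Key=PREFIX + name, Body="".join(chunks))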

Now my question is: can I use the existing Filebeat MySQL module with s3 as the input?

I tried to send the logs directly to the MySQL ingest pipelines using the configuration below.

output.elasticsearch:
  hosts: ['http://elasticsearch-01.some.com:9200']

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  username: "elastic"
  password: "CgkIXXXXXXXXXX"
  worker: 8

  pipelines:
    - pipeline: "filebeat-7.4.0-mysql-slowlog-pipeline"
      when.contains:
        aws.s3.object.key: "mysql-slowquery.log"
    - pipeline: "filebeat-7.4.0-mysql-error-pipeline"
      when.contains:
        aws.s3.object.key: "mysql-error.log"

It looks like you can override the module input. Can you try that with the mysql module, configuring the s3 input inside it?

https://www.elastic.co/guide/en/beats/filebeat/7.5/advanced-settings.html
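For reference, a minimal sketch of what that override could look like in modules.d/mysql.yml, assuming the S3 objects are announced through an SQS queue (the queue URL below is a placeholder):

- module: mysql
  error:
    enabled: true
    input:
      type: s3
      # Placeholder SQS queue that receives notifications for the RDS log objects.
      queue_url: https://sqs.us-east-1.amazonaws.com/123456789012/rds-mysql-logs
  slowlog:
    enabled: true
    input:
      type: s3
      queue_url: https://sqs.us-east-1.amazonaws.com/123456789012/rds-mysql-logs

If that works, the module should apply its own ingest pipelines, so the explicit pipelines conditions under output.elasticsearch above shouldn't be needed.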
