I have a working Functionbeat sending logs from CloudWatch containing MySQL slow query data to Elasticsearch, and I am able to view the log data in Kibana.
I am trying to understand a few things regarding various log types.
- What is the best practice for dealing with multiple log groups that have different formats, e.g. MySQL (RDS) slow query, error, AWS ELB, S3, etc.? Do we have one function with multiple streams, or one function per stream?
- Related to the above, how do we then handle the different parsing requirements for each log group? Is this a general Beats question, where we'd set up a single ingest pipeline and then have a processor (or sub-pipeline) per expected log type?
- Am I correct in assuming that even though Kibana and Filebeat have MySQL support, we can't use the pre-built dashboards because these logs arrive via Functionbeat rather than the Filebeat MySQL module?
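For the second question, here is a sketch of what I had in mind: a single routing pipeline that uses `if` conditions on the log group name to hand each event to a per-type sub-pipeline. The pipeline names and the `awscloudwatch.log_group` field are assumptions on my part, not something I've confirmed Functionbeat sets:

```json
PUT _ingest/pipeline/cloudwatch-router
{
  "description": "Route CloudWatch events to a sub-pipeline per log type (sketch)",
  "processors": [
    {
      "pipeline": {
        "if": "ctx.awscloudwatch?.log_group != null && ctx.awscloudwatch.log_group.contains('slowquery')",
        "name": "mysql-slowquery"
      }
    },
    {
      "pipeline": {
        "if": "ctx.awscloudwatch?.log_group != null && ctx.awscloudwatch.log_group.contains('elb')",
        "name": "aws-elb"
      }
    }
  ]
}
```

Each sub-pipeline (e.g. `mysql-slowquery`) would then hold the grok/dissect processors for that one format. Is this the recommended pattern, or is a function per log group cleaner?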