Handling multiple logstream formats with Functionbeat

Hi,

I have a working Functionbeat sending MySQL slow query logs from CloudWatch to Elasticsearch, and I am able to view the log data in Kibana.

I am trying to understand a few things regarding various log types.

  1. What is the best practice for dealing with multiple log groups that have different formats, e.g. MySQL (RDS) slow query, RDS error, AWS ELB, S3, etc.? Should we have one function with multiple triggers, or a function per log group? (A config sketch follows after this list.)
  2. Related to the above, how do we then handle the different parsing requirements for each log group? Is this a general Beats question, where we'd set up a single ingest pipeline and then have separate processors for each expected log type? (A pipeline sketch follows after this list.)
  3. Am I correct in assuming that even though Kibana and Filebeat have a MySQL module with pre-built dashboards, we can't use those dashboards because these logs arrive via Functionbeat?
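
For context on question 1, a single function with multiple CloudWatch triggers is what I'm currently considering. This is only a sketch of my understanding of functionbeat.yml; the log group names, deploy bucket, and Elasticsearch host are placeholders, not my actual setup:

```yaml
# functionbeat.yml sketch: one function subscribed to several CloudWatch log groups.
# Log group names, the deploy bucket, and the ES host below are placeholders.
functionbeat.provider.aws.deploy_bucket: "my-functionbeat-deploy-bucket"
functionbeat.provider.aws.functions:
  - name: cloudwatch-mixed
    enabled: true
    type: cloudwatch_logs
    description: "Ship RDS slow query, RDS error, and ELB logs"
    triggers:
      - log_group_name: /aws/rds/instance/mydb/slowquery
      - log_group_name: /aws/rds/instance/mydb/error
      - log_group_name: /my/elb/log-group

output.elasticsearch:
  hosts: ["https://my-es-host:9243"]
  # Send every event through one ingest pipeline that routes by log group
  pipeline: "cloudwatch-router"
```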
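And for question 2, my current thinking is one routing pipeline in Elasticsearch that hands each event off to a format-specific pipeline keyed on the log group name. I'm assuming here that Functionbeat puts the log group name into a field like `log_group` (I still need to confirm the exact field name in my documents), and the `mysql-slowquery` / `mysql-error` pipelines would hold the actual grok processors:

```
PUT _ingest/pipeline/cloudwatch-router
{
  "description": "Route CloudWatch events to a format-specific pipeline",
  "processors": [
    {
      "pipeline": {
        "if": "ctx.log_group != null && ctx.log_group.contains('slowquery')",
        "name": "mysql-slowquery"
      }
    },
    {
      "pipeline": {
        "if": "ctx.log_group != null && ctx.log_group.contains('error')",
        "name": "mysql-error"
      }
    }
  ]
}
```

Does that approach sound right, or is there a more idiomatic way to do this with Functionbeat?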
