I'm in the middle of testing how well deploying a MariaDB Galera cluster per app works in a Docker Swarm stack.
The idea is that we can tune each Galera cluster to fit each individual app's needs, keep them highly available, and keep the apps separated from each other.
As part of my testing I'm actively trying to break things, which means I really need solid monitoring in place to demonstrate that our clusters are working the way we want.
When I started digging into using the ELK Stack for this, I got stuck on how to get the data from each Galera Cluster into Elasticsearch.
For my setup, I have a 3-node test Elasticsearch cluster and Fleet-managed Elastic Agents. My Docker Swarm cluster has 3 managers and 3 workers, each with the Agent installed directly on the host, NOT in containers.
I can monitor the containers themselves and grab their logs, but those logs don't show up on the built-in MySQL dashboards. I also can't tell whether my custom pipeline rerouting logs from the docker pipeline into the mysql pipeline has worked, since only error logs are actually being picked up...
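For reference, the reroute attempt looks roughly like this. I'm writing it from memory, so treat the exact @custom pipeline name, the field path in the condition, and the target dataset as approximations rather than what's actually deployed (and the reroute processor is fairly new, so version support may matter):

```
PUT _ingest/pipeline/logs-docker.container_logs@custom
{
  "description": "Send MariaDB/Galera container logs to the mysql.error data stream",
  "processors": [
    {
      "reroute": {
        "if": "ctx.container?.image?.name != null && ctx.container.image.name.contains('mariadb')",
        "dataset": "mysql.error",
        "namespace": "default"
      }
    }
  ]
}
```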
Then there are metrics. Docker metrics are not MySQL metrics, so I can't reroute them the way I'm trying to with the logs, which leaves configuring the MySQL integration. But how exactly should I configure it? Do I have to publish a port per container and then point the config at all of those ports? That seems a bit painful.
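To make that concrete, the "port per container" option I'm imagining looks something like this (service names, ports and hostnames are made-up placeholders):

```yaml
# docker-stack.yml (excerpt): one published port per Galera node
services:
  galera-node1:
    image: mariadb:10.11
    ports:
      - "33061:3306"
  galera-node2:
    image: mariadb:10.11
    ports:
      - "33062:3306"
  galera-node3:
    image: mariadb:10.11
    ports:
      - "33063:3306"

# ...and then the MySQL integration's "hosts" setting would have to list every
# endpoint (Swarm's routing mesh exposes the published ports on every node):
#   hosts:
#     - tcp(swarm-host:33061)/
#     - tcp(swarm-host:33062)/
#     - tcp(swarm-host:33063)/
```

And that's per app stack, so the port bookkeeping adds up fast.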
What about sidecar containers? Could I add Filebeat and Metricbeat to each cluster's stack? But wouldn't that mean those logs end up in different indices from the rest of my Agent-managed logs? And would the logs and metrics go through the right ingest pipelines to land where the built-in dashboards expect them?
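The sidecar idea I'm picturing is roughly this, with one Metricbeat service (and the same pattern for Filebeat) added to each app's stack. The image tag, network name, service names, credentials and Elasticsearch output are all placeholders, not a working setup:

```yaml
# docker-stack.yml (excerpt): a Metricbeat sidecar on the app's overlay network
services:
  metricbeat:
    image: docker.elastic.co/beats/metricbeat:8.13.4
    networks: [galera-net]
    configs:
      - source: metricbeat-mysql
        target: /usr/share/metricbeat/modules.d/mysql.yml
    # Elasticsearch output settings omitted here

configs:
  metricbeat-mysql:
    file: ./metricbeat-mysql.yml

networks:
  galera-net:
    driver: overlay   # shared with the galera-node* services above

# metricbeat-mysql.yml (the module config the sidecar would load):
# - module: mysql
#   metricsets: ["status", "galera_status"]
#   hosts:
#     - tcp(galera-node1:3306)/
#     - tcp(galera-node2:3306)/
#     - tcp(galera-node3:3306)/
#   username: monitor
#   password: "${MYSQL_MONITOR_PASSWORD}"
```

As far as I know the classic Metricbeat module writes to metricbeat-* by default, which is exactly the "different indices" worry above.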
Anyway, it's the end of my Friday, so I'm going to stop rambling.
Anyone have any suggestions on how I should approach this?