I am using Logstash to sync 4 tables from MySQL to Elasticsearch. My problem is how to scale this to 200 customers. Each customer has its own database schema, possibly on a different server, and each customer should end up in its own Elasticsearch index. What would be the best approach? Should I run Logstash on every database server and sync all of that server's schemas from there, with the four pipelines each reading from multiple JDBC inputs and writing to multiple index outputs? Or should I set it up with Docker and Kubernetes, so that each container instance handles a single input and output?
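For context, the multi-pipeline approach I have in mind would look roughly like this (a sketch only — hostnames, schema names, paths and credentials are placeholders, and there would be one `jdbc` input block per customer schema):

```
# pipelines.yml — one pipeline per table (hypothetical ids/paths)
- pipeline.id: table1-sync
  path.config: "/etc/logstash/conf.d/table1.conf"
- pipeline.id: table2-sync
  path.config: "/etc/logstash/conf.d/table2.conf"
# ... table3, table4

# table1.conf — repeated jdbc input per customer, index per customer
input {
  jdbc {
    jdbc_driver_library => "/opt/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://db-host-1:3306/customer_1"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    schedule => "*/5 * * * *"
    statement => "SELECT * FROM table1 WHERE updated_at > :sql_last_value"
    tags => ["customer_1"]
  }
  # ... another jdbc block for customer_2, customer_3, ...
}
output {
  if "customer_1" in [tags] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "customer_1-table1"
    }
  }
  # ... matching output per customer tag
}
```

With 200 customers this means 200 input blocks per pipeline config, which is what makes me doubt this approach scales.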