Hi,
We're planning to introduce Elasticsearch in our product to improve search performance. We currently have around 200 database instances with the same schema, holding data about the tenants in our app.
We want to combine all of the DBs into a single index on one ES instance. We'd like to build an indexing solution that can run overnight to index those DBs, but we're not sure about the recommended approach. This would cover the initial indexing of the current DBs; after that, we want to use Kafka events with a single Logstash instance to keep the index up to date.
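To make it concrete, here's a rough sketch of the kind of per-DB pipeline we're imagining for the nightly run. The connection string, table, schedule, and tenant id are all made up; the idea is just a JDBC input tagging each document with its tenant:

```
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://tenant042-db:5432/app"   # one DB per pipeline
    jdbc_user => "readonly"
    jdbc_password => "${DB_PASSWORD}"
    schedule => "0 2 * * *"                                               # run nightly at 02:00
    # only pick up rows changed since the last run
    statement => "SELECT * FROM documents WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
  }
}

filter {
  # stamp every document with its tenant so all tenants can share one index
  mutate { add_field => { "tenant_id" => "tenant042" } }
}

output {
  elasticsearch {
    hosts => ["https://es.internal:9200"]
    index => "tenants"
    document_id => "tenant042-%{id}"   # tenant-prefixed id avoids cross-tenant collisions
  }
}
```

And for the ongoing updates, something like this Kafka-driven pipeline (topic and field names are made up; it assumes each event carries `tenant_id` and `id`):

```
input {
  kafka {
    bootstrap_servers => "kafka.internal:9092"
    topics => ["tenant-updates"]
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["https://es.internal:9200"]
    index => "tenants"
    document_id => "%{tenant_id}-%{id}"   # same id scheme as the nightly bulk load
    action => "update"
    doc_as_upsert => true                 # create the doc if the nightly run hasn't seen it yet
  }
}
```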
Our first idea was to run multiple Logstash instances in Docker containers, deployed to AWS with Chef or Puppet. Each instance would read from one of the DBs and push data to ES. We're not sure whether this would be a good solution.
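Roughly, the deployment we pictured looks like this (image tag, paths, and service names are just placeholders), i.e. one Logstash container per tenant DB, each with its own pipeline config:

```
# docker-compose.yml sketch: one Logstash container per tenant DB
services:
  logstash-tenant001:
    image: docker.elastic.co/logstash/logstash:8.13.0
    volumes:
      - ./pipeline/tenant001.conf:/usr/share/logstash/pipeline/logstash.conf:ro
    environment:
      DB_PASSWORD: ${TENANT001_DB_PASSWORD}

  logstash-tenant002:
    image: docker.elastic.co/logstash/logstash:8.13.0
    volumes:
      - ./pipeline/tenant002.conf:/usr/share/logstash/pipeline/logstash.conf:ro
    environment:
      DB_PASSWORD: ${TENANT002_DB_PASSWORD}
```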
Our second idea was to run multiple pipelines within a single Logstash instance, all targeting our ES cluster, but we doubt that would perform better.
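That is, something like this in pipelines.yml (ids and paths made up), with one pipeline per tenant DB inside a single Logstash process:

```
# pipelines.yml sketch: many pipelines, one Logstash instance
- pipeline.id: tenant001
  path.config: "/etc/logstash/conf.d/tenant001.conf"
  pipeline.workers: 1
- pipeline.id: tenant002
  path.config: "/etc/logstash/conf.d/tenant002.conf"
  pipeline.workers: 1
```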
Are there any good practices for this kind of problem?