What is the best way to configure multiple SQL tables in Logstash?

Hi,

I have multiple tables in SQL Server that need to be imported into Elasticsearch using Logstash. So far I've been writing one conf file per table, but it's getting hard to maintain several config files.

Would it be better to define all the tables as separate inputs in a single Logstash configuration file? I'm concerned that this would put more load on Logstash and slow down the import process. Can someone explain how Logstash behaves here, as I'm planning to implement this in production?

Thanks in advance

Having multiple files is no different from having one big one, unless you are running multiple LS instances.
At runtime Logstash concatenates all of the individual files into a single configuration.
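For illustration, a minimal single-file sketch might look like the following, with one jdbc input per table and a `type` used to route each table to its own index. The connection string, credentials, table names, and index names are all placeholders, not values from this thread:

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM orders"
    type => "orders"
  }
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM customers"
    type => "customers"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{type}"
  }
}
```

Because all files are concatenated anyway, this behaves the same as keeping each jdbc block in its own file in the config directory.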


So basically, a single configuration file has to be created for each table in any RDBMS, right? Is there a more efficient way of doing this, like writing a statement such as:

Select table_name from information_schema.tables

and somehow iterating over the table names returned by the query, running the Logstash configuration once for each table_name?
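Logstash itself has no built-in way to loop over query results like that, but one common workaround is to generate the per-table config files with a small script. A sketch in Python, where the table list is hard-coded to keep the example self-contained (in practice you would fetch it from information_schema.tables, e.g. via pyodbc), and all connection details are placeholders:

```python
# Hypothetical generator: renders one Logstash jdbc config per table.
# TABLES would normally come from "SELECT table_name FROM
# information_schema.tables"; it is hard-coded here for the sketch.
TEMPLATE = """input {{
  jdbc {{
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM {table}"
    type => "{table}"
  }}
}}
output {{
  elasticsearch {{
    hosts => ["localhost:9200"]
    index => "{table}"
  }}
}}
"""

def render_configs(tables):
    """Return a dict mapping config filename -> rendered config text."""
    return {f"{t}.conf": TEMPLATE.format(table=t) for t in tables}

if __name__ == "__main__":
    # Placeholder table names; write one .conf file per table.
    for name, text in render_configs(["orders", "customers"]).items():
        print(name)  # orders.conf, customers.conf
```

You would run the script whenever the table list changes and point Logstash at the directory containing the generated files; since Logstash concatenates everything in that directory, no other change is needed.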