Multiple SQL tables to one Elasticsearch index

I have been using Logstash to index data into Elasticsearch from MySQL via the JDBC plugin. I was using a join query to pull the data, but my use case grew to include checking for inserts and updates. So I used tracking columns and :sql_last_value, but I have update-timestamp columns in both tables and want to detect an update or insert in either one. However, I have only been able to track changes against a single update column of one table. How can I check both update columns, i.e. how do I give Logstash two (or more) tracking columns?

I do not think you can. You need to write queries that generate documents with a single tracking column. If you need to track changes across multiple tables that form part of a join, you may need to select the maximum of the individual tracking columns into a single combined tracking column to compare against. This naturally results in a more complex SQL query, but it should be possible.
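For example, assuming two hypothetical joined tables `orders` and `customers` that each carry their own `updated_at` column (names invented for illustration), MySQL's GREATEST() can collapse the two timestamps into one tracking column:

```sql
-- Sketch only: table and column names are assumptions.
-- GREATEST() picks the most recent of the two update timestamps,
-- so a change in either table moves the combined tracking value.
SELECT o.id,
       o.total,
       c.name,
       GREATEST(o.updated_at, c.updated_at) AS unified_tracking
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE GREATEST(o.updated_at, c.updated_at) > :sql_last_value
ORDER BY unified_tracking
```

In the Logstash JDBC input you would then point `tracking_column => "unified_tracking"` with `tracking_column_type => "timestamp"` at the combined column.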

If you cannot create a single query, the easiest option may be to write a custom script and use that instead of Logstash.

Thanks, can you give me a hint on how to combine multiple columns? When I use CONCAT I cannot reference the result in the same query. Also, how can I map :sql_last_value onto that field, given that tracking_column only accepts [integer, timestamp]?

Maybe something like this would work? That may not be the best way to do it, but it has been quite some time since I wrote more advanced SQL queries.
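A possible shape for that query, as a sketch with assumed table and column names: MySQL does not let a WHERE clause reference a column alias defined in the same SELECT, so wrapping the join in a derived table makes the computed tracking column addressable by name. Because GREATEST() over two timestamps returns a timestamp, the result also satisfies the [integer, timestamp] restriction on tracking_column.

```sql
-- Sketch only: orders/customers and updated_at are assumptions.
-- The derived table "joined" exposes the computed tracking column,
-- so the outer WHERE can filter on it directly.
SELECT *
FROM (
  SELECT o.id,
         c.name,
         GREATEST(o.updated_at, c.updated_at) AS unified_tracking
  FROM orders o
  JOIN customers c ON c.id = o.customer_id
) AS joined
WHERE joined.unified_tracking > :sql_last_value
ORDER BY joined.unified_tracking
```

With `tracking_column => "unified_tracking"` and `tracking_column_type => "timestamp"` in the JDBC input, Logstash should persist the latest combined timestamp as :sql_last_value for the next run.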
