Only import new or updated rows using JDBC

Right now I've got Logstash importing a miniature version of my MSSQL database using the JDBC plugin, with each JDBC input scheduled to run every minute to update Elasticsearch. To update, I'm currently re-importing every table and row in the database and adding all rows to Elasticsearch. Once I start using the full database, though, this will be very inefficient, as a full pass will take longer than a minute. Is there another way to keep Elasticsearch in sync with my database?

I've tried using the 'sql_last_value' parameter to import only new rows, but that only works when the table's 'id' is a number and each new entry has a greater value than the last. Some tables in the database have an 'id' that is essentially random (e.g. "43f4-f43ef-e44454r"), which can't be used with 'sql_last_value' because the values can't be meaningfully compared. I'm also unable to modify the actual database at all, which rules out a lot of potential solutions.

I feel as if I'm out of options here, so can anyone suggest anything I can try?
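For reference, the 'sql_last_value' approach I tried looks roughly like this (driver path, connection details, and table names are simplified placeholders):

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * *"    # run every minute
    # only fetch rows with an id greater than the last one seen
    statement => "SELECT * FROM my_table WHERE id > :sql_last_value"
    use_column_value => true   # track a column value instead of the run timestamp
    tracking_column => "id"    # only works when values keep increasing
    tracking_column_type => "numeric"
  }
}
```

This is exactly what breaks on the tables whose 'id' values are random strings.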

Is there any other column that you can use? Something that indicates a last modification date? If not, you seem to be out of luck; Logstash can't detect what's new without some help from the data.
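If such a column does exist (say a 'modified_at' datetime that the application keeps up to date on every insert and update), you could track that instead of the id. A minimal sketch, assuming that hypothetical column name and placeholder connection details:

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * *"
    # only fetch rows changed since the last successful run
    statement => "SELECT * FROM my_table WHERE modified_at > :sql_last_value"
    use_column_value => true
    tracking_column => "modified_at"
    tracking_column_type => "timestamp"  # compare as a timestamp, not a number
    last_run_metadata_path => "/path/to/.logstash_jdbc_last_run"
  }
}
```

That would pick up both new and updated rows, as long as 'modified_at' really is refreshed on every change; deleted rows still wouldn't be detected this way.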
