How to configure Logstash to read only new entries from a database without using "sql_last_start"

Is it possible to query the database with something like "SELECT * FROM TABLE_NAME WHERE id > :last_saved_id" instead of using the built-in parameter "sql_last_start", e.g. "SELECT * FROM TABLE_NAME WHERE timestamp > :sql_last_start"?
My table has no timestamp column, and I want to schedule Logstash with a JDBC input and an Elasticsearch output.
I don't want to query the whole table on every run, only the new entries, but I don't know how or where to save the "id" from the previous run so it can be used at the next scheduled time. Say the table has 1000 entries and Logstash has already read all of them. Meanwhile, 100 new entries arrive. When the next scheduled run fetches data from the database, only those 100 new entries should be returned in the result set and passed to the Logstash output.
Any idea?

Recording the max and min values of the last-seen column is something we may add in the future. That would let you run a query that checks against something like :latest_max_<column_name>. This only helps if your IDs are ascending for every new entry rather than random strings. Please file an issue on the GitHub page (https://github.com/logstash-plugins/logstash-input-jdbc) for this and we would be happy to start tackling this feature!

Was this issue ever submitted on GitHub? If not, I'll submit it.

I have a very similar issue: my data arrives late, and its timestamp is the event time, not the write time (reporting nodes ship data every hour, so on the hour I receive the previous hour of data). This feature would be incredibly helpful for me.

This is now possible with the latest version of the plugin, which supports tracking_column.
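
For anyone landing here later, here is a minimal pipeline sketch of how that can look. The connection string, driver path, credentials, schedule, and index name are placeholders, not values from this thread; use_column_value, tracking_column, :sql_last_value, and last_run_metadata_path are the plugin options doing the actual work:

    input {
      jdbc {
        # Placeholder connection details; adjust for your database and driver.
        jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
        jdbc_user => "user"
        jdbc_password => "password"
        jdbc_driver_library => "/path/to/mysql-connector-java.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        schedule => "* * * * *"   # run every minute

        # Track the numeric id column instead of a timestamp.
        # :sql_last_value holds the highest id seen on the previous run,
        # persisted in last_run_metadata_path between runs and restarts.
        use_column_value => true
        tracking_column => "id"
        statement => "SELECT * FROM TABLE_NAME WHERE id > :sql_last_value"
        last_run_metadata_path => "/var/lib/logstash/.jdbc_last_run"
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "table_name"
        document_id => "%{id}"   # using the id avoids duplicates if rows are re-read
      }
    }

In newer plugin versions you can also set tracking_column_type => "timestamp" if you track a time column instead; numeric tracking is the default.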