Logstash JDBC input plugin - Pagination

Hi,
I need a way to replicate what paging does. I'll simplify and explain only the part of the problem we are stuck on.

Consider a table with IDs 1 to 2000 and 4000 to 1M (IDs from 2001 to 3999 are not present), and the JDBC input configured as:

......
jdbc {
    statement => "SELECT [columns] FROM [table]
                  WHERE id > :sql_last_value
                    AND id < :sql_last_value + :page_size"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    parameters => { "page_size" => "1000" }
    schedule => "*/1 * * * *"
}
....

For the first two iterations everything works fine, and :sql_last_value ends up at 2000.
But on the third iteration Logstash looks for IDs from 2001 to 3000, which are not present, so no rows come back, :sql_last_value never advances, and every subsequent iteration keeps querying the same empty range.
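In other words, from the third run onwards the executed statement is effectively the following (values substituted purely for illustration):

    -- :sql_last_value = 2000, :page_size = 1000
    SELECT [columns] FROM [table]
    WHERE id > 2000 AND id < 3000;
    -- 0 rows: no event carries a new "id" value, so :sql_last_value
    -- stays at 2000 and every later run repeats the same empty window.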

I want to know if there is a way for me to achieve something like this:

......
jdbc {
    statement => "SELECT [columns] FROM [table]
                  WHERE id >  :page_size * :current_page
                    AND id <= :page_size * (:current_page + 1)"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    parameters => { "page_size" => "1000", "current_page" => "0" }
    schedule => "*/1 * * * *"
    last_run_metadata_path => "file_path"
}

filter {
........
}

output {
  elasticsearch {
    .....
  }
}
....

and have 'current_page' incremented after every scheduled Logstash run?
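To make the intent concrete, these are roughly the statements that scheme would generate with page_size = 1000 substituted in (the numbers are only illustrative):

    -- current_page = 1 (second run): ids 1001..2000
    SELECT [columns] FROM [table] WHERE id > 1000 * 1 AND id <= 1000 * (1 + 1);

    -- current_page = 2 (third run): ids 2001..3000 returns no rows,
    -- but current_page would still advance to 3, so the next run covers
    -- ids 3001..4000 and the gap no longer blocks progress.
    SELECT [columns] FROM [table] WHERE id > 1000 * 2 AND id <= 1000 * (2 + 1);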

I know I can set jdbc_paging_enabled => true and jdbc_page_size => 1000, but I specifically need a way to achieve the scenario described above.
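For reference, here is a minimal sketch of that built-in paging (the connection settings are placeholders, not values from my setup); it splits a single statement into LIMIT/OFFSET chunks within one scheduled run, which is why it does not give the one-page-per-run counter described above:

    jdbc {
        # connection settings below are placeholders
        jdbc_connection_string => "jdbc:..."
        jdbc_user => "user"
        jdbc_driver_class => "com.example.Driver"
        statement => "SELECT [columns] FROM [table] WHERE id > :sql_last_value ORDER BY id"
        # built-in paging: the result set is fetched in chunks of jdbc_page_size
        jdbc_paging_enabled => true
        jdbc_page_size => 1000
        use_column_value => true
        tracking_column => "id"
        tracking_column_type => "numeric"
        schedule => "*/1 * * * *"
    }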
