Multiple statements in Logstash JDBC input

Hello,

I have configured the JDBC input plugin with 3 statements, as below (providing just an example):

input {
  jdbc {
    statement => "select * from DATABASE1 where TRANSACTION_DATE > :sql_last_value"
    use_column_value => true
    tracking_column => "TRANSACTION_DATE"
    tracking_column_type => "timestamp"
  }
  jdbc {
    statement => "select * from DATABASE1 where TRANSACTION_DATE > :sql_last_value"
    use_column_value => true
    tracking_column => "TRANSACTION_DATE"
    tracking_column_type => "timestamp"
  }
  jdbc {
    statement => "select * from DATABASE2 where TRANSACTION_DATE > :sql_last_value"
    use_column_value => true
    tracking_column => "TRANSACTION_DATE"
    tracking_column_type => "timestamp"
  }
}

The first two statements run against Database1 and the third statement runs against a different database.

The results for the first two statements are always correct.

However, the output for the third query seems to be wrong. We are getting data from the past, even though I have provided a tracking column and am only looking for data from the last day.

Is the above config correct?

Do I need to make any changes to run the SQL statements on different databases?

Any suggestions?

Hello,

By default, the Logstash JDBC input stores the last value in $HOME/.logstash_jdbc_last_run, which is a simple text file. So the sql_last_value from the first SQL is stored and used by the second SQL, and so on. I don't know why you only have problems with the last SQL, but the solution is to set last_run_metadata_path:

  • Create a directory, e.g. $HOME/.logstash_jdbc_last_run
  • Configure last_run_metadata_path for each SQL to point to a separate file within this directory, as shown in the sketch below
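
A minimal sketch of how this could look, assuming such a directory exists (the file names db1_query1 and db2_query1 are just placeholders):

input {
  jdbc {
    statement => "select * from DATABASE1 where TRANSACTION_DATE > :sql_last_value"
    use_column_value => true
    tracking_column => "TRANSACTION_DATE"
    tracking_column_type => "timestamp"
    # this input keeps its tracking state in its own file
    last_run_metadata_path => "${HOME}/.logstash_jdbc_last_run/db1_query1"
  }
  jdbc {
    statement => "select * from DATABASE2 where TRANSACTION_DATE > :sql_last_value"
    use_column_value => true
    tracking_column => "TRANSACTION_DATE"
    tracking_column_type => "timestamp"
    # a different file, so the inputs do not overwrite each other's sql_last_value
    last_run_metadata_path => "${HOME}/.logstash_jdbc_last_run/db2_query1"
  }
}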

Best regards
Wolfram


Hello,

Yes, you are correct. I have re-examined the output for the other SQLs as well, and it looks like we have the issue for all of the SQLs.

Below is the current content of the JDBC last run file:

cat .logstash_jdbc_last_run
--- 1970-01-01 00:00:00.000000000 Z

Below is my JDBC input:

jdbc {
  jdbc_validate_connection => true
  jdbc_connection_string => "jdbc:oracle:thin:@//"
  jdbc_user => "*****"
  jdbc_password => "*****"
  jdbc_driver_library => "/elk/dependencies/ojdbc7.jar"
  jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
  schedule => "*/20 * * * *"
  statement => "select CALLNUMBER, STATUS, SOURCESYSTEM, DESCRIPTION, INSTANCE_NUM, TRANSACTION_DATE, CUSTOMERID, DESTINATIONSYSTEM, CUSTOMERPROBLEMNUMBER, CUSTEQUIPMENTNUMBER from TABLE_NAME where status in ('SIEBELERROR','ERROR', 'EXHAUSTED') and TRANSACTION_DATE > trunc(sysdate) and TRANSACTION_DATE > :sql_last_value order by TRANSACTION_DATE desc"
  use_column_value => true
  tracking_column => "TRANSACTION_DATE"
  tracking_column_type => "timestamp"
  type => "IN_CTL"
}

Could you please let me know what I am missing here, and why .logstash_jdbc_last_run is set to 1970, which is the default value?

The requirement is to get only the recent data (i.e. the data generated after the previous run of the query).

Hi,

I think I know what the problem is, as I ran into the same issue when I started using the JDBC input: your tracking_column is in uppercase, but by default the JDBC input converts all column names to lowercase.

Can you please try either changing the tracking_column to lowercase or setting lowercase_column_names to false?
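
For example, only changing the relevant options of your input (a sketch, connection settings omitted):

# Option 1: use the lowercased column name
jdbc {
  # ... connection options as before ...
  use_column_value => true
  tracking_column => "transaction_date"
  tracking_column_type => "timestamp"
}

# Option 2: keep the original case and disable lowercasing
jdbc {
  # ... connection options as before ...
  lowercase_column_names => false
  use_column_value => true
  tracking_column => "TRANSACTION_DATE"
  tracking_column_type => "timestamp"
}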

Best regards
Wolfram


Thanks, I will try this and let you know.

However, could you please let me know what difference it makes if the tracking column is changed to lowercase?

The difference is that the SQL currently returns columns like transaction_date, callnumber, status, etc. The JDBC input then tries to get the value of the column TRANSACTION_DATE, which cannot be found because the column name was converted to lowercase; therefore, the JDBC input stores the default date value, which is 1970-01-01 00:00:00.000000000 Z.
After changing the value of tracking_column to lowercase, the JDBC input will correctly read the last value of transaction_date and store e.g. 2020-09-23 09:20:13.000000000 Z.
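
You can verify the fix by checking the last run file after the next scheduled run; it should now contain the latest TRANSACTION_DATE instead of the epoch default, e.g.:

cat .logstash_jdbc_last_run
--- 2020-09-23 09:20:13.000000000 Z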

