Tail database table entries

Hi,

I am using the JDBC input in Logstash. The query it runs selects data from a couple of tables. On the initial run, the data is extracted by the query and stored in Elasticsearch. But if new records are inserted into the tables involved in the query, is it possible to get that data into Elasticsearch immediately?

jdbc {
  type => "databaseData"
  jdbc_driver_library => "C:\tools\ELK5\logstash-5.2.2\jdbc-drivers\db2jcc4.jar"
  jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
  jdbc_connection_string => "jdbc:db2://localhost:50000/UBDB"
  jdbc_user => "dbuser"
  jdbc_password => "dbpass"
  statement_filepath => "C:\tools\ELK5\logstash-5.2.2\query\transfer.sql"
  jdbc_fetch_size => 5
}

Thanks,
Gajanan

You can schedule the jdbc input to run at arbitrary intervals. The plugin provides a sql_last_value parameter that you can use in your query to select only rows that are more recent than the last value fetched during the previous run. This requires that you have a "last modified" column or similar so that it's easy to find the rows that should be fed into ES. This is described in the documentation.
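
For example, here is a minimal sketch of what that could look like, with the statement inlined and assuming the driving table has a LAST_MODIFIED timestamp column (MYTABLE and LAST_MODIFIED are hypothetical names; adapt them to your schema):

jdbc {
  type => "databaseData"
  jdbc_driver_library => "C:\tools\ELK5\logstash-5.2.2\jdbc-drivers\db2jcc4.jar"
  jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
  jdbc_connection_string => "jdbc:db2://localhost:50000/UBDB"
  jdbc_user => "dbuser"
  jdbc_password => "dbpass"
  # Run the query every minute (cron-style schedule).
  schedule => "* * * * *"
  # Track the highest LAST_MODIFIED value seen so far and expose it
  # to the statement as :sql_last_value.
  use_column_value => true
  tracking_column => "last_modified"
  tracking_column_type => "timestamp"
  # Only fetch rows changed since the previous run.
  statement => "SELECT * FROM MYTABLE WHERE LAST_MODIFIED > :sql_last_value"
  jdbc_fetch_size => 5
}

The last seen value is persisted between runs (by default in a .logstash_jdbc_last_run file in the Logstash user's home directory), so the input picks up where it left off after a restart.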

I will try that. Thank you so much!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.