Hi,
I am using the JDBC input in Logstash. The query it runs selects data from a couple of tables. On the initial run, the data is extracted by the query and stored in Elasticsearch. But if new records are inserted into the tables involved in the query, is it possible to get that data into Elasticsearch immediately? This is my current configuration:
jdbc {
  type => "databaseData"
  jdbc_driver_library => "C:\tools\ELK5\logstash-5.2.2\jdbc-drivers\db2jcc4.jar"
  jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
  jdbc_connection_string => "jdbc:db2://localhost:50000/UBDB"
  jdbc_user => "dbuser"
  jdbc_password => "dbpass"
  statement_filepath => "C:\tools\ELK5\logstash-5.2.2\query\transfer.sql"
  jdbc_fetch_size => 5
}
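For reference, below is a minimal sketch of what I think a scheduled, incremental version of this input might look like. It assumes the result set has a monotonically increasing column such as LAST_UPDATED and that the SQL file gets a matching WHERE clause; both of those are placeholders I made up, not part of my real schema or query.

jdbc {
  type => "databaseData"
  jdbc_driver_library => "C:\tools\ELK5\logstash-5.2.2\jdbc-drivers\db2jcc4.jar"
  jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
  jdbc_connection_string => "jdbc:db2://localhost:50000/UBDB"
  jdbc_user => "dbuser"
  jdbc_password => "dbpass"
  # run the query every minute instead of only once at startup
  schedule => "* * * * *"
  # remember the highest value seen so far and only pick up newer rows
  use_column_value => true
  tracking_column => "last_updated"
  tracking_column_type => "timestamp"
  # transfer.sql would then need a predicate like:
  #   WHERE LAST_UPDATED > :sql_last_value
  statement_filepath => "C:\tools\ELK5\logstash-5.2.2\query\transfer.sql"
  jdbc_fetch_size => 5
}

Would something along these lines be the recommended way to pick up new rows close to real time, or is there a better approach?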
Thanks,
Gajanan