Hello
I am running ELK 6.3.2.
logstash-input-jdbc: v4.3.9
My current JDBC pipeline configuration has about 15-20 inputs with JDBC queries. These inputs are scheduled to run every minute ("* * * * *") or once a day.
For output I use the elasticsearch output plugin.
I have a few questions about how the input works:
- Why is the value in the file at last_run_metadata_path written only after the JDBC query has fully completed?
- What happens if the JDBC query fails with an error partway through? Will the value in last_run_metadata_path still be recorded?
- Why is the value in last_run_metadata_path not recorded after each page of the query when jdbc_paging_enabled is set?
- What happens if the input does not finish reading all the data before the next scheduled execution?
P.S. I experimented with jdbc_fetch_size and jdbc_page_size, but they are not in the config example below.
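For reference, the paging-related settings I tried looked roughly like this (the values here are illustrative, not the exact ones from my config):

    jdbc_paging_enabled => true
    jdbc_page_size => 100000   # rows per paged query
    jdbc_fetch_size => 1000    # rows fetched per round trip to Oracle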
Example of one of the inputs:
input {
  jdbc {
    id => "input_jdbc_order"
    jdbc_driver_library => "${LOGSTASH_JDBC_DRIVER:ojdbc6.jar}"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "${LOGSTASH_JDBC_CONNECTION}"
    jdbc_user => "${LOGSTASH_JDBC_USER}"
    jdbc_password => "${LOGSTASH_JDBC_PASSWORD}"
    use_column_value => true
    tracking_column => "tracking_number"
    last_run_metadata_path => "${LOGSTASH_METADATA_PATH}/main/order"
    statement_filepath => "${LOGSTASH_JDBC_SQL_SCRIPTS_PATH}/main/order.sql"
    schedule => "* * * * *" # run every minute
    add_field => { "[@metadata][type]" => "order" }
  }
}
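The statement in order.sql follows the usual incremental pattern with the :sql_last_value placeholder; simplified, it looks something like:

    -- simplified sketch, not my actual query
    SELECT * FROM orders
    WHERE tracking_number > :sql_last_value
    ORDER BY tracking_number

And if I understand correctly, after a successful run the file at last_run_metadata_path contains just the last tracked value serialized as YAML, e.g.:

    --- 1234567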
Regards