Jdbc input - how to trigger immediately after pipeline is started / reloaded?

Hi,

I am currently implementing a Logstash pipeline that uses the JDBC input plugin.
Is there any way to make Logstash run the JDBC input query immediately after the pipeline has been started or reloaded?

When I change the pipeline config during development, it is just so inefficient to always have to wait until the next scheduled minute before the query runs again.

Thanks, Andreas

What does your config look like? If you don't provide a schedule, it should start as soon as the process spins up...
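For example, something like this (driver path, connection details, and the query are just placeholders, not values from your setup) should fire the query once, right after the pipeline starts:

    input {
      jdbc {
        # placeholder driver and connection details - adjust to your environment
        jdbc_driver_library => "/path/to/ojdbc8.jar"
        jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
        jdbc_connection_string => "jdbc:oracle:thin:@dbhost:1521/SERVICE"
        jdbc_user => "scott"
        jdbc_password => "tiger"
        statement => "SELECT sysdate FROM dual"
        # no schedule => the statement runs once, as soon as the pipeline starts
      }
    }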

I defined it to run once a minute:

	jdbc_driver_library => "${ORA_JDBC_LIB_PATH}"
	jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"

	jdbc_connection_string => "${JDBC_AMES_CONNECT_STRING}"
	jdbc_user => "${JDBC_AMES_USER}"
	jdbc_password => "${JDBC_AMES_PW}"
	add_field => {
		"stage" => "${STAGE}"
		"hostName" => "${DB_HOSTNAME}"
	}

	jdbc_validate_connection => true
	statement => "
		here comes the statement
	"

	schedule => "* * * * *"

If I do not provide a schedule, does it then run only once?

I have not messed with it in a while, but I believe you need to specify:

last_run_metadata_path => "/etc/logstash/jdbc/.logstash_jdbc_last_run"

Don't worry about the path I have; you can define your own, just make sure Logstash can write to the file. With this in place and no schedule, the pipeline will run until the query is finished and then terminate...
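Roughly, for your config above, that would mean dropping the schedule line and adding the metadata path. This is just a sketch reusing your placeholders; clean_run is optional and only there if you want to reset sql_last_value between test runs:

	jdbc_connection_string => "${JDBC_AMES_CONNECT_STRING}"
	jdbc_user => "${JDBC_AMES_USER}"
	jdbc_password => "${JDBC_AMES_PW}"
	statement => "here comes the statement"
	last_run_metadata_path => "/etc/logstash/jdbc/.logstash_jdbc_last_run"
	# no schedule => the query runs immediately on startup/reload, then the input finishes
	# clean_run => true   # uncomment to ignore the persisted sql_last_value on the next run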
