Logstash jdbc ignore default last run folder

I am using the Logstash JDBC input plugin and everything is working fine, but it logs an error every time the SQL query is executed.
The error is:


This causes no problem whatsoever; it just floods the logs, which is annoying.
I have set last_run_metadata_path in my conf file, so I don't know where the plugin is getting that default path from.

input {
	jdbc {
		jdbc_connection_string => "jdbc:${PG_URI}${DB_NAME}?user=${DB_USER}"
		jdbc_user => "${DB_USER}"
		jdbc_driver_library => "${JDBC_DRIVER_DIR}/postgresql-42.1.4.jar"
		jdbc_driver_class => "org.postgresql.Driver"
		schedule => "* * * * *"
		# schedule => "0 0 * * *" # runs once a day, at midnight
		statement => "SELECT * FROM items where created_at > :sql_last_value"
		use_column_value => true
		tracking_column => "created_at"
		tracking_column_type => "timestamp" # created_at is a timestamp; the default type is numeric
		last_run_metadata_path => "${LAST_RUN_DIR}/.000_${DB_NAME}_jdbc_last_run"
		type => "created-items"
	}
}
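One thing worth ruling out: if the ${LAST_RUN_DIR} or ${DB_NAME} environment variable is not visible to the Logstash process, the interpolated last_run_metadata_path is broken and the plugin can fall back to its default location (historically $HOME/.logstash_jdbc_last_run). A minimal pre-flight sketch, assuming hypothetical fallback values for the two variables, checks that the path resolves and is writable before starting Logstash:

```shell
# Sanity-check sketch (assumed paths): verify the interpolated
# last_run_metadata_path can actually be created and written.
LAST_RUN_DIR="${LAST_RUN_DIR:-/tmp/logstash_last_run}"  # hypothetical fallback for this sketch
DB_NAME="${DB_NAME:-items_db}"                          # hypothetical database name

last_run_file="$LAST_RUN_DIR/.000_${DB_NAME}_jdbc_last_run"

# Create the directory and the metadata file the same way the plugin would need to.
mkdir -p "$LAST_RUN_DIR"
touch "$last_run_file"

# Show the resolved path and its permissions.
ls -l "$last_run_file"
```

Run this as the same user that runs Logstash; if the touch fails, the plugin cannot persist :sql_last_value there either, which would explain the repeated error on every scheduled run.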
