Implement concurrency while inserting data into Elasticsearch using a JDBC connection in the Logstash config file

Hi,

I am using the Logstash config file below. I have set the jdbc schedule to run every 1 minute, so it keeps inserting data into Elasticsearch every minute. But when the number of records is large, I cannot tell whether it behaves transactionally; it should lock the source data while it reads. How can I implement locking (or something like it) in this case?

input {
  jdbc {
    jdbc_driver_library => "demo\sqljdbc_6.4\enu\mssql-jdbc-6.4.0.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://L-ff;databaseName=Cultivation;integratedSecurity=true;"
    jdbc_user => "manjur.gani"
    statement_filepath => "demo:\SQLQuery12.sql"
    schedule => "* * * * *"
    tracking_column => "sdate"
    last_run_metadata_path => "demo.logstash_jdbc_last_run"
    tracking_column_type => "timestamp"
    use_column_value => true
    type => "farm"
  }
}

output {
  # code
}
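One common approach (a sketch, not official Elastic guidance) is to avoid database locking entirely and make the pipeline incremental and idempotent instead. The SQL in SQLQuery12.sql can filter on the :sql_last_value placeholder, which the jdbc input maintains from tracking_column, so each scheduled run reads only rows newer than the last run; and the elasticsearch output can set document_id from the row's primary key, so a row that happens to be picked up twice overwrites its own document instead of creating a duplicate. The table name farm_data, key column id, host, and index name below are assumptions for illustration:

-- SQLQuery12.sql: read only rows newer than the last recorded sdate
-- (:sql_last_value is filled in by the jdbc input from last_run_metadata_path)
SELECT id, sdate
FROM farm_data
WHERE sdate > :sql_last_value
ORDER BY sdate ASC

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # assumed host
    index => "farm"               # assumed index name
    document_id => "%{id}"        # primary key makes re-inserts idempotent
  }
}

With this pattern, even if a scheduled run overlaps or repeats rows, Elasticsearch simply updates the existing documents, so no explicit lock on the source table is needed.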
