Implementing concurrency control when inserting data into ES using Logstash


I am using the config file below for Logstash, with `schedule => "* * * * *"` on the jdbc input so that data is pushed into ES every minute. But when the number of records is more, I am not able to tell whether the runs behave transactionally, i.e. whether each run locks the source data before the next one starts. How can I implement this kind of locking in this case?

input {
  jdbc {
    jdbc_driver_library => "demo\sqljdbc_6.4\enu\mssql-jdbc-6.4.0.jre8.jar"
    # the driver class must not be empty; this is the class shipped in mssql-jdbc
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://L-ff;databaseName=Cultivation;integratedSecurity=true;"
    jdbc_user => "manjur.gani"
    schedule => "* * * * *"
    tracking_column => "sdate"
    last_run_metadata_path => "demo.logstash_jdbc_last_run"
    tracking_column_type => "timestamp"
    use_column_value => true
    type => "farm"
  }
}
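One way to keep each run short (so it finishes well within the one-minute schedule) is to fetch only rows newer than the last run, using the tracking column together with the built-in `:sql_last_value` placeholder. A minimal sketch of that shape, where the table name `farm_data` is an assumption (not given in the original config):

```
input {
  jdbc {
    # ... driver, connection, and user settings as above ...
    schedule => "* * * * *"
    use_column_value => true
    tracking_column => "sdate"
    tracking_column_type => "timestamp"
    # hypothetical table name; only rows newer than the previous
    # run's high-water mark are selected
    statement => "SELECT * FROM farm_data WHERE sdate > :sql_last_value ORDER BY sdate"
  }
}
```

With this, each scheduled run processes only the delta since the previous run instead of re-reading the whole table.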


I'm not sure what you're asking. What transaction? What locking?

But when number of records are more

More than what?

Okay, let's say the schedule interval is 1 minute. If my first run takes more than 1 minute because of heavy data, and meanwhile the 2nd run starts (since the schedule fires every minute), the question is: how can we prevent this behavior?

Okay. Queries will never run concurrently, i.e. if the first query takes longer than one minute to process, the second query won't run until the first one's done.
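For readers who want to see what that non-overlap guarantee looks like in general (outside Logstash), here is a small Python sketch: a scheduled tick tries to take a lock without blocking, and simply skips its run if the previous run still holds it. This mimics the behavior described above; it is an illustration, not Logstash's actual implementation.

```python
import threading
import time

class NonOverlappingJob:
    """Runs a job on scheduled ticks but skips a tick if the
    previous run is still in progress, so runs never overlap."""

    def __init__(self, work):
        self.work = work
        self._lock = threading.Lock()
        self.completed = 0
        self.skipped = 0

    def tick(self):
        # Try to take the lock without blocking; if a previous run
        # still holds it, skip this tick entirely.
        if not self._lock.acquire(blocking=False):
            self.skipped += 1
            return
        try:
            self.work()
            self.completed += 1
        finally:
            self._lock.release()

def slow_work():
    time.sleep(0.2)  # pretend the query overruns the interval

job = NonOverlappingJob(slow_work)

# Fire two "scheduled" ticks a moment apart, like a 1-minute
# schedule where the first run overruns into the second slot.
t1 = threading.Thread(target=job.tick)
t2 = threading.Thread(target=job.tick)
t1.start()
time.sleep(0.05)
t2.start()
t1.join(); t2.join()

print(job.completed, job.skipped)  # -> 1 1
```

The second tick is dropped rather than queued, which is the behavior you want here: the next scheduled query will pick up where the tracking column left off.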

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.