JDBC connection remains open after data insertion into Elasticsearch

Hello,

We are inserting data into Elasticsearch from multiple PostgreSQL tables using Logstash. Here is a demo view of one of the config files used for this:
input {
  jdbc {
    jdbc_connection_string => "Connection detail"
    jdbc_user => "xyz"
    jdbc_password => "xyz"
    jdbc_validate_connection => true
    #jdbc_driver_library => "library path"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "select * from schema.sdtm_ex"
  }
}

output {
  #stdout { codec => json_lines }
  elasticsearch {
    index => "sdtm_ex"
    document_type => "ex"
  }
}

Now, after the data has been inserted into Elasticsearch, the connection still shows as open in the DB. Is there any way to close the JDBC connection once the insertion is complete? Thanks.
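
For context, this is roughly the kind of query we run on the PostgreSQL side to see the lingering sessions (assuming the standard pg_stat_activity view and the same xyz user as in the config above):

  -- Sessions held by the Logstash JDBC user; 'idle' rows are connections that were never closed
  select pid, state, backend_start, query_start, query
  from pg_stat_activity
  where usename = 'xyz';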

Edit 1: This issue only happens when we add a scheduler to the pipeline. If we run it once against a single table, it works fine. The scheduled variant of the input block is sketched below.
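
For reference, the scheduled input looks roughly like this (the cron expression here is only an example, not our exact schedule):

  input {
    jdbc {
      jdbc_connection_string => "Connection detail"
      jdbc_user => "xyz"
      jdbc_password => "xyz"
      jdbc_validate_connection => true
      jdbc_driver_class => "org.postgresql.Driver"
      statement => "select * from schema.sdtm_ex"
      # cron-like syntax: run the statement every 5 minutes
      schedule => "*/5 * * * *"
    }
  }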
