SQL Server data to Elasticsearch using Logstash

Hi

input {
  jdbc {
    # SQL Server JDBC connection string to your database, productdb
    # "jdbc:sqlserver://HostName\instanceName;database=DBName;user=UserName;password=Password"
    jdbc_connection_string => "jdbc:sqlserver://XXXX;database=XXXX;user=XXXX;password=XXXXX;"
    # The user you want to execute your statement as
    jdbc_user => nil
    # The path to your downloaded JDBC driver
    jdbc_driver_library => "C:/Program Files/Microsoft JDBC Driver 4.0 for SQL Server/sqljdbc_4.2/enu/jre8/sqljdbc42.jar"
    # The name of the driver class for SQL Server
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # Query for testing purposes
    statement => "SELECT * FROM Country"
  }
}

output {
  elasticsearch {
    hosts => ["https://elastic.XXXXXX:9200"]
    user => "XXXXX"
    password => "XXXX"
    index => "cs_country"
  }
  stdout { }
}
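
As a side note, once the test statement works, the jdbc input can be scheduled and made incremental instead of re-reading the whole table. A minimal sketch, assuming the table has a LastModified column (the column name, schedule, and placeholder connection values below are illustrative, not part of my actual setup):

```
input {
  jdbc {
    # Placeholder connection settings - substitute your own
    jdbc_connection_string => "jdbc:sqlserver://localhost;database=productdb"
    jdbc_user => "UserName"
    jdbc_password => "Password"
    jdbc_driver_library => "C:/path/to/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # Run the statement every minute (cron-like syntax)
    schedule => "* * * * *"
    # Track the highest value seen so far and pass it in as :sql_last_value
    use_column_value => true
    tracking_column => "lastmodified"
    tracking_column_type => "timestamp"
    # Assumes a LastModified column exists on the table (hypothetical)
    statement => "SELECT * FROM Country WHERE LastModified > :sql_last_value"
  }
}
```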

Getting this error:
[2021-09-10T18:14:13,168][ERROR][logstash.javapipeline ][main][1b260913fb906cb14d0e10b411e900e700cc941bab9ac01dcc7d8cd125033610] A plugin had an unrecoverable error. Will restart this plugin.

C:/Program Files/Elastic/logstash-7.13.2/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.7/lib/logstash/plugin_mixins/jdbc/common.rb:42:in `load_driver_jars'
C:/Program Files/Elastic/logstash-7.13.2/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.7/lib/logstash/plugin_mixins/jdbc/common.rb:25:in `load_driver'
C:/Program Files/Elastic/logstash-7.13.2/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.7/lib/logstash/inputs/jdbc.rb:275:in `run'
C:/Program Files/Elastic/logstash-7.13.2/logstash-core/lib/logstash/java_pipeline.rb:405:in `inputworker'
C:/Program Files/Elastic/logstash-7.13.2/logstash-core/lib/logstash/java_pipeline.rb:396:in `block in start_input'

I would say that the problem is with loading your JDBC driver. Are you sure that the provided driver path is correct?

@Andy0708
The Driver path is correct.

Note: the same config was working fine before I enabled basic security on the ELK stack.

After copying a fresh JDK into the Logstash folder, the issue is fixed.
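
For anyone hitting this later: the stack trace fails inside load_driver_jars, meaning the JVM that Logstash runs on could not load the driver jar, which is consistent with a broken bundled JDK rather than a wrong path. A quick way to check whether a given JDK can see a JDBC driver class, independent of Logstash, is a tiny loader test (the class name and jar path here are examples; adjust to your install):

```java
// DriverCheck.java - minimal sketch: can this JVM load a given JDBC driver class?
// Compile with javac, then run with the same JDK Logstash uses, e.g.:
//   "C:/Program Files/Elastic/logstash-7.13.2/jdk/bin/java" -cp "C:/path/to/sqljdbc42.jar;." DriverCheck
public class DriverCheck {
    // Returns true if the named class is loadable on the current classpath
    static boolean check(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Default to the SQL Server driver class from the config above
        String driver = args.length > 0 ? args[0]
                : "com.microsoft.sqlserver.jdbc.SQLServerDriver";
        System.out.println(driver + (check(driver)
                ? " loaded OK" : " NOT found on classpath"));
    }
}
```

If this prints "NOT found" even with the jar on the classpath, the JDK itself (or the jar) is the problem, not the Logstash config.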