Logstash error - Unable to retrieve Elasticsearch version

I am trying to transfer data from MSSQL to Elasticsearch using a Logstash config file, but I keep getting the error below.


[2025-03-10T00:13:43,950][WARN ][logstash.runner          ] NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.
[2025-03-10T00:13:43,954][INFO ][logstash.runner          ] Log4j configuration path used is: C:\ElasticStack\logstash-8.17.3\config\log4j2.properties
[2025-03-10T00:13:43,956][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.17.3", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.6+7-LTS on 21.0.6+7-LTS +indy +jit [x86_64-mswin32]"}
[2025-03-10T00:13:43,957][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-03-10T00:13:43,987][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2025-03-10T00:13:43,987][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2025-03-10T00:13:44,015][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2025-03-10T00:13:44,937][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-03-10T00:13:45,185][INFO ][org.reflections.Reflections] Reflections took 101 ms to scan 1 urls, producing 152 keys and 530 values
[2025-03-10T00:13:51,889][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-10T00:13:51,899][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:5601"]}
[2025-03-10T00:13:52,024][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@localhost:5601/]}}
[2025-03-10T00:13:52,244][ERROR][logstash.outputs.elasticsearch][main] Unable to retrieve Elasticsearch version {:exception=>LogStash::Json::ParserError, :message=>"Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 2]"}
[2025-03-10T00:13:52,246][ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Could not connect to a compatible version of Elasticsearch>, :backtrace=>["C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:281:in `block in healthcheck!'", "org/jruby/RubyHash.java:1615:in `each'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:265:in `healthcheck!'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:397:in `update_urls'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:109:in `update_initial_urls'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:103:in `start'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client.rb:371:in `build_pool'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client.rb:64:in `initialize'", "org/jruby/RubyClass.java:922:in `new'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:106:in `create_http_client'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:102:in `build'", 
"C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:42:in `build_client'", "C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.12-java/lib/logstash/outputs/elasticsearch.rb:301:in `register'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:69:in `register'", "C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:245:in `block in register_plugins'", "org/jruby/RubyArray.java:1981:in `each'", "C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:244:in `register_plugins'", "C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:622:in `maybe_setup_out_plugins'", "C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:257:in `start_workers'", "C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:198:in `run'", "C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:150:in `block in start'"], "pipeline.sources"=>["C:/ElasticStack/logstash-8.17.3/config/mssql.conf"], :thread=>"#<Thread:0x9f522aa C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-10T00:13:52,247][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2025-03-10T00:13:52,254][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2025-03-10T00:13:52,262][INFO ][logstash.runner          ] Logstash shut down.
[2025-03-10T00:13:52,267][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:924) ~[jruby.jar:?]
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:883) ~[jruby.jar:?]
        at C_3a_.ElasticStack.logstash_minus_8_dot_17_dot_3.lib.bootstrap.environment.<main>(C:\ElasticStack\logstash-8.17.3\lib\bootstrap\environment.rb:90) ~[?:?]

Below is my config file. Can someone help me resolve this issue?

input {
  jdbc {
    jdbc_driver_library => "C:\ElasticStack\logstash-8.17.3\jdbc\mssql-jdbc-12.8.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-D8HR0LS:1433;databaseName=xxx;integratedSecurity=false;"
    jdbc_user => "sa"
    jdbc_password => "abcd1234"
    statement => "SELECT * from tbl_Uniq_ID"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:5601"]
    index => "legacy_test"
    user => "elastic"
    password => "xxxx"
  }

  stdout { codec => rubydebug }
}

Elasticsearch typically runs on port 9200; Kibana runs on port 5601. Your output is pointing Logstash at the Kibana port, which returns HTML instead of JSON — hence the "Unexpected character ('<')" parser error.

Also check whether your cluster uses HTTPS; you can test with curl:

curl -s -k -u elastic:xxxx  http://localhost:9200
curl -s -k -u elastic:xxxx https://localhost:9200

One of these two commands should work; replace xxxx with your actual password, obviously.
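Concretely, the output section should point at Elasticsearch rather than Kibana. A minimal sketch, keeping your other settings as-is (use https instead of http if the curl test above only works over HTTPS):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "legacy_test"
    user => "elastic"
    password => "xxxx"
  }
}
```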

Hi, I have updated the port. It works now, but I got a different error.

[2025-03-10T02:36:55,592][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-10T02:36:55,604][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://localhost:9200"]}
[2025-03-10T02:36:55,703][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2025-03-10T02:36:55,930][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target", :exception=>Manticore::ClientProtocolException, :cause=>#<Java::JavaxNetSsl::SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target>}
[2025-03-10T02:36:55,932][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://localhost:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
[2025-03-10T02:36:55,939][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"legacy_test"}
[2025-03-10T02:36:55,940][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2025-03-10T02:36:55,954][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["C:/ElasticStack/logstash-8.17.3/config/mssql.conf"], :thread=>"#<Thread:0x1a74b500 C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-10T02:36:56,539][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.59}
[2025-03-10T02:36:56,778][INFO ][logstash.inputs.jdbc     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-10T02:36:56,778][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-03-10T02:36:56,799][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2025-03-10T02:36:59,402][ERROR][logstash.inputs.jdbc     ][main][09faef8ae2a00a7be119499f74f76e52b60fc9db1ced85eec4f0a0eb7ef0c082]
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host DESKTOP-D8HR0LS, port 1433 has failed. Error: "Connection refused: getsockopt. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(com/microsoft/sqlserver/jdbc/SQLServerException.java:242) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SQLServerException.convertConnectExceptionToSQLServerException(com/microsoft/sqlserver/jdbc/SQLServerException.java:308) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SocketFinder.findSocket(com/microsoft/sqlserver/jdbc/IOBuffer.java:2593) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.TDSChannel.open(com/microsoft/sqlserver/jdbc/IOBuffer.java:721) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:3768) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:3385) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:3194) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:1971) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(com/microsoft/sqlserver/jdbc/SQLServerDriver.java:1263) ~[mssql-jdbc-12.8.1.jre8.jar:?]
        at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(jdk/internal/reflect/DirectMethodHandleAccessor.java:103) ~[?:?]
        at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:580) ~[?:?]
        at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:330) ~[jruby.jar:?]
        at org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:188) ~[jruby.jar:?]
        at RUBY.connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/adapters/jdbc.rb:237) ~[?:?]
        at RUBY.new_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:245) ~[?:?]
        at RUBY.make_new(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool.rb:163) ~[?:?]
        at RUBY.assign_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool/threaded.rb:225) ~[?:?]
        at RUBY.acquire(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool/threaded.rb:139) ~[?:?]
        at RUBY.hold(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool/threaded.rb:91) ~[?:?]
        at RUBY.synchronize(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:283) ~[?:?]
        at RUBY.test_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:291) ~[?:?]
        at RUBY.initialize(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/misc.rb:205) ~[?:?]
        at org.jruby.RubyClass.new(org/jruby/RubyClass.java:922) ~[jruby.jar:?]
        at org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen) ~[jruby.jar:?]
        at RUBY.connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:54) ~[?:?]
        at RUBY.connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/core.rb:124) ~[?:?]
        at RUBY.jdbc_connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:127) ~[?:?]
        at org.jruby.RubyKernel.loop(org/jruby/RubyKernel.java:1725) ~[jruby.jar:?]
        at org.jruby.RubyKernel$INVOKER$s$0$0$loop.call(org/jruby/RubyKernel$INVOKER$s$0$0$loop.gen) ~[jruby.jar:?]
        at RUBY.jdbc_connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:124) ~[?:?]
        at RUBY.open_jdbc_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:168) ~[?:?]
        at org.jruby.ext.monitor.Monitor.synchronize(org/jruby/ext/monitor/Monitor.java:82) ~[jruby.jar:?]
        at org.jruby.ext.monitor.Monitor$INVOKER$i$0$0$synchronize.call(org/jruby/ext/monitor/Monitor$INVOKER$i$0$0$synchronize.gen) ~[jruby.jar:?]
        at RUBY.open_jdbc_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:164) ~[?:?]
        at RUBY.execute_statement(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:226) ~[?:?]
        at org.jruby.ext.monitor.Monitor.synchronize(org/jruby/ext/monitor/Monitor.java:82) ~[jruby.jar:?]
        at org.jruby.ext.monitor.Monitor$INVOKER$i$0$0$synchronize.call(org/jruby/ext/monitor/Monitor$INVOKER$i$0$0$synchronize.gen) ~[jruby.jar:?]
        at RUBY.execute_statement(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:223) ~[?:?]
        at RUBY.execute_query(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/inputs/jdbc.rb:353) ~[?:?]
        at RUBY.run(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/inputs/jdbc.rb:326) ~[?:?]
        at RUBY.inputworker(C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:420) ~[?:?]
        at RUBY.start_input(C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:411) ~[?:?]
        at org.jruby.RubyProc.call(org/jruby/RubyProc.java:354) ~[jruby.jar:?]
        at java.lang.Thread.run(java/lang/Thread.java:1583) ~[?:?]
[2025-03-10T02:36:59,408][ERROR][logstash.inputs.jdbc     ][main][09faef8ae2a00a7be119499f74f76e52b60fc9db1ced85eec4f0a0eb7ef0c082] Unable to connect to database. Tried 1 times {:message=>"Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host DESKTOP-D8HR0LS, port 1433 has failed. Error: \"Connection refused: getsockopt. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.\".", :exception=>Sequel::DatabaseConnectionError, :cause=>#<Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host DESKTOP-D8HR0LS, port 1433 has failed. Error: "Connection refused: getsockopt. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".>, :backtrace=>["com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(com/microsoft/sqlserver/jdbc/SQLServerException.java:242)", "com.microsoft.sqlserver.jdbc.SQLServerException.convertConnectExceptionToSQLServerException(com/microsoft/sqlserver/jdbc/SQLServerException.java:308)", "com.microsoft.sqlserver.jdbc.SocketFinder.findSocket(com/microsoft/sqlserver/jdbc/IOBuffer.java:2593)", "com.microsoft.sqlserver.jdbc.TDSChannel.open(com/microsoft/sqlserver/jdbc/IOBuffer.java:721)", "com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:3768)", "com.microsoft.sqlserver.jdbc.SQLServerConnection.login(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:3385)", "com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:3194)", 
"com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:1971)", "com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(com/microsoft/sqlserver/jdbc/SQLServerDriver.java:1263)", "jdk.internal.reflect.DirectMethodHandleAccessor.invoke(jdk/internal/reflect/DirectMethodHandleAccessor.java:103)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:580)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:330)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:188)", "RUBY.connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/adapters/jdbc.rb:237)", "RUBY.new_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:245)", "RUBY.make_new(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool.rb:163)", "RUBY.assign_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool/threaded.rb:225)", "RUBY.acquire(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool/threaded.rb:139)", "RUBY.hold(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/connection_pool/threaded.rb:91)", "RUBY.synchronize(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:283)", "RUBY.test_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:291)", "RUBY.initialize(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/misc.rb:205)", "org.jruby.RubyClass.new(org/jruby/RubyClass.java:922)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", 
"RUBY.connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/database/connecting.rb:54)", "RUBY.connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/sequel-5.87.0/lib/sequel/core.rb:124)", "RUBY.jdbc_connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:127)", "org.jruby.RubyKernel.loop(org/jruby/RubyKernel.java:1725)", "org.jruby.RubyKernel$INVOKER$s$0$0$loop.call(org/jruby/RubyKernel$INVOKER$s$0$0$loop.gen)", "RUBY.jdbc_connect(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:124)", "RUBY.open_jdbc_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:168)", "org.jruby.ext.monitor.Monitor.synchronize(org/jruby/ext/monitor/Monitor.java:82)", "org.jruby.ext.monitor.Monitor$INVOKER$i$0$0$synchronize.call(org/jruby/ext/monitor/Monitor$INVOKER$i$0$0$synchronize.gen)", "RUBY.open_jdbc_connection(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:164)", "RUBY.execute_statement(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:226)", "org.jruby.ext.monitor.Monitor.synchronize(org/jruby/ext/monitor/Monitor.java:82)", "org.jruby.ext.monitor.Monitor$INVOKER$i$0$0$synchronize.call(org/jruby/ext/monitor/Monitor$INVOKER$i$0$0$synchronize.gen)", "RUBY.execute_statement(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/plugin_mixins/jdbc/jdbc.rb:223)", "RUBY.execute_query(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/inputs/jdbc.rb:353)", 
"RUBY.run(C:/ElasticStack/logstash-8.17.3/vendor/bundle/jruby/3.1.0/gems/logstash-integration-jdbc-5.5.2/lib/logstash/inputs/jdbc.rb:326)", "RUBY.inputworker(C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:420)", "RUBY.start_input(C:/ElasticStack/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:411)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:354)", "java.lang.Thread.run(java/lang/Thread.java:1583)"]}
[2025-03-10T02:36:59,416][WARN ][logstash.inputs.jdbc     ][main][09faef8ae2a00a7be119499f74f76e52b60fc9db1ced85eec4f0a0eb7ef0c082] Exception when executing JDBC query {:exception=>Sequel::DatabaseConnectionError, :message=>"Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host DESKTOP-D8HR0LS, port 1433 has failed. Error: \"Connection refused: getsockopt. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.\".", :cause=>"#<Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host DESKTOP-D8HR0LS, port 1433 has failed. Error: \"Connection refused: getsockopt. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.\".>"}
[2025-03-10T02:36:59,416][ERROR][logstash.inputs.jdbc     ][main][09faef8ae2a00a7be119499f74f76e52b60fc9db1ced85eec4f0a0eb7ef0c082] Unable to execute statement. Tried 1 times.
[2025-03-10T02:36:59,501][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,501][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,504][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,508][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,508][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,508][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,508][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,508][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,507][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,507][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,506][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker3
[2025-03-10T02:36:59,506][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker11
[2025-03-10T02:36:59,505][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,511][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker7
[2025-03-10T02:36:59,511][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker1
[2025-03-10T02:36:59,510][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker6
[2025-03-10T02:36:59,510][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker4
[2025-03-10T02:36:59,509][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker8
[2025-03-10T02:36:59,509][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker10
[2025-03-10T02:36:59,509][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker2
[2025-03-10T02:36:59,508][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker0
[2025-03-10T02:36:59,512][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker5
[2025-03-10T02:36:59,807][INFO ][logstash.outputs.elasticsearch][main][473c0781e27f55ace2f5178e1a52f9f9d4e53201e7878040c7068cd0a9f02137] Aborting the batch due to shutdown request while waiting for connections to become live
[2025-03-10T02:36:59,808][INFO ][org.logstash.execution.WorkerLoop][main] Received signal to abort processing current batch. Terminating pipeline worker [main]>worker9
[2025-03-10T02:37:00,941][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2025-03-10T02:37:01,320][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2025-03-10T02:37:01,325][INFO ][logstash.runner          ] Logstash shut down.

Hi @kawalkarhemant, welcome to the community!

Glad you're making progress. This one is a pretty direct error: Logstash can't connect to the database with the current address, port, and driver settings.
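A couple of things worth checking on the SQL Server side: TCP/IP is often disabled by default (SQL Server Configuration Manager > SQL Server Network Configuration > Protocols), and named instances listen on dynamic ports rather than 1433. If you are on a named instance, the connection string would look something like the sketch below (the instance name SQLEXPRESS is a placeholder; substitute your own, and note that recent mssql-jdbc drivers default to encrypted connections, so encrypt=false may be needed against a server without TLS configured):

```
jdbc_connection_string => "jdbc:sqlserver://DESKTOP-D8HR0LS;instanceName=SQLEXPRESS;databaseName=xxx;encrypt=false;integratedSecurity=false;"
```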

@kawalkarhemant You probably also want to check that the SSL setup is correct (the better solution), or you can disable certificate verification by adding

ssl_verification_mode => "none"

or

ssl_certificate_verification => false

(depending on the plugin version) to the elasticsearch output in Logstash.
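For example, with a recent logstash-output-elasticsearch plugin the output might look like the sketch below. Disabling verification is insecure and for testing only; the better long-term fix is to point ssl_certificate_authorities at your cluster's CA certificate (the http_ca.crt path shown is the Elasticsearch 8.x default and is an assumption — adjust to your install):

```
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "legacy_test"
    user => "elastic"
    password => "xxxx"
    # quick test only: skip certificate validation (insecure)
    ssl_verification_mode => "none"
    # better: trust the cluster CA instead, e.g.
    # ssl_certificate_authorities => ["C:/ElasticStack/elasticsearch-8.17.3/config/certs/http_ca.crt"]
  }
}
```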


The issue is still not resolved. I am getting the following error:

 New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://localhost:9200"]}
[2025-03-10T23:40:41,003][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2025-03-10T23:40:41,171][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target", :exception=>Manticore::ClientProtocolException, :cause=>#<Java::JavaxNetSsl::SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target>}
[2025-03-10T23:40:41,173][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://localhost:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
[2025-03-10T23:40:41,177][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"legacy_test"}
[2025-03-10T23:40:41,178][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2025-03-10T23:40:41,190][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/config/mssql.conf"], :thread=>"#<Thread:0x39f1853f C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-10T23:40:41,692][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.5}
[2025-03-10T23:40:41,925][INFO ][logstash.inputs.jdbc     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-10T23:40:41,926][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}

Config file -


input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-xxx:1433;databaseName=xxx;encrypt=false;trustServerCertificate=true"
    jdbc_user => "sa"
    jdbc_password => "abcd1234"
    jdbc_driver_library => "C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\logstash-core\lib\jars\mssql-jdbc-12.8.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM tbl_Uniq_ID"
  }
}

output {
  elasticsearch {
		hosts => ["https://localhost:9200"]
		index => "legacy_test"
		user => "elastic"
		password => "xxxx"
  }
}

You didn't think my suggestion would be helpful? Maybe try it?

As RainTown said, you have to set the certificate fields in order to establish an SSL session.

output {
  elasticsearch {
		hosts => ["https://localhost:9200"]
		index => "legacy_test"
		user => "elastic"
		password => "xxxx"
 		ssl_certificate_authorities => ["C:/ElasticStack/logstash-8.17.3/config/ca-cert.pem"]
 		ssl_enabled  => true
 		ssl_verification_mode  => "none" # or set to "full" to verify the server certificate and hostname
 		ssl_certificate => "C:/ElasticStack/logstash-8.17.3/config/cert.pem"
 		ssl_key => "C:/ElasticStack/logstash-8.17.3/config/cert-key.pem"
  }
}

The last two are optional.

Hi, I appreciate the reply. I tried your solution, but it did not solve the issue.

I have MSSQL Developer Edition. It keeps asking me to provide certificate authentication, as per the log file shown.

From the replies, I am not completely sure you have understood the issue. So let's check.

At least based on what's been shared, your logstash configuration can pull data from your database, but it cannot store it into your elasticsearch cluster (which is likely a single node?) because of the SSL cert errors. That is my understanding — is that all correct?

If that's all correct, the way to get it working is to either point logstash to valid certificates, or tell logstash to not verify the certificate. This is done within the logstash configuration, as has been shared. You would need to adjust the specific paths appropriately and check that those files are valid and readable by the logstash process.
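
For the "valid certificates" route: if elasticsearch generated its own certificates during setup (the default in 8.x), the CA it created is usually written into the elasticsearch config directory as certs/http_ca.crt. A sketch — the install path and credentials below are assumptions you would need to adjust:

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "elastic"
    password => "changeme"
    ssl_enabled => true
    ssl_certificate_authorities => ["C:/ElasticStack/elasticsearch-8.17.3/config/certs/http_ca.crt"]
    ssl_verification_mode => "full"
  }
}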

Is the host or virtual machine running logstash+elasticsearch a Linux or Windows host? Did you generate certificates when setting up elasticsearch, or did elasticsearch just generate its own?

If by "it" you mean "MSSQL Developer Edition", then you will need to find ways to do the same as above with that tool/set of tools.

Hi, I have built the MSSQL certificate and provided the path, but I am still getting the following error -

[2025-03-16T23:02:53,258][WARN ][logstash.runner          ] NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.
[2025-03-16T23:02:53,258][INFO ][logstash.runner          ] Log4j configuration path used is: C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\log4j2.properties
[2025-03-16T23:02:53,258][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.17.3", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.6+7-LTS on 21.0.6+7-LTS +indy +jit [x86_64-mswin32]"}
[2025-03-16T23:02:53,258][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-03-16T23:02:53,290][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2025-03-16T23:02:53,290][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2025-03-16T23:02:53,321][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2025-03-16T23:02:54,175][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/-C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/config/mssql.conf"}
[2025-03-16T23:02:54,177][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2025-03-16T23:02:54,244][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-03-16T23:02:54,252][INFO ][logstash.runner          ] Logstash shut down.
[2025-03-16T23:02:54,256][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:924) ~[jruby.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:883) ~[jruby.jar:?]
	at C_3a_.ElasticStack.logstash_minus_8_dot_17_dot_3_minus_windows_minus_x86_64.logstash_minus_8_dot_17_dot_3.lib.bootstrap.environment.<main>(C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\lib\bootstrap\environment.rb:90) ~[?:?]

Updated Config file -

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-D8HR0LS:1433;databaseName=XXXX;encrypt=false;trustServerCertificate=true"
    jdbc_user => "sa"
    jdbc_password => "XXXX"
    jdbc_driver_library => "C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\logstash-core\lib\jars\mssql-jdbc-12.8.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM tbl_Uniq_ID"
  }
}

output {
  elasticsearch {
		hosts => ["https://localhost:9200"]
		index => "legacy_test"
		user => "elastic"
		password => "XXX"
		ssl_certificate_authorities => ["C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\SQL_Cert.cer"]
# 		ssl_enabled  => false
 		ssl_verification_mode  => "full"
  }
}

That path looks wrong to me. How are you starting logstash?

I am running logstash using the following command -
logstash -f ..\config\mssql.conf

Is there any other way to run it?

I cannot imagine how expand_path could turn that into the path in the error message.

Hi, I am getting the following error -

C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\bin>logstash -f C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\mssql.conf
"Using bundled JDK: C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\jdk\bin\java.exe"
Sending Logstash logs to C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/logs which is now configured via log4j2.properties
[2025-03-17T13:27:10,030][WARN ][logstash.runner          ] NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.
[2025-03-17T13:27:10,030][INFO ][logstash.runner          ] Log4j configuration path used is: C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\log4j2.properties
[2025-03-17T13:27:10,030][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.17.3", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.6+7-LTS on 21.0.6+7-LTS +indy +jit [x86_64-mswin32]"}
[2025-03-17T13:27:10,030][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-03-17T13:27:10,061][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2025-03-17T13:27:10,061][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2025-03-17T13:27:10,092][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2025-03-17T13:27:11,750][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-03-17T13:27:11,930][INFO ][org.reflections.Reflections] Reflections took 85 ms to scan 1 urls, producing 152 keys and 530 values
[2025-03-17T13:27:17,741][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-17T13:27:17,750][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://localhost:9200"]}
[2025-03-17T13:27:17,863][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2025-03-17T13:27:18,029][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target", :exception=>Manticore::ClientProtocolException, :cause=>#<Java::JavaxNetSsl::SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target>}
[2025-03-17T13:27:18,031][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://localhost:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
[2025-03-17T13:27:18,035][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"legacy_test"}
[2025-03-17T13:27:18,036][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2025-03-17T13:27:18,046][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/config/mssql.conf"], :thread=>"#<Thread:0x56f75b2a C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-17T13:27:18,571][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.53}
[2025-03-17T13:27:19,099][INFO ][logstash.inputs.jdbc     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-17T13:27:19,100][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-03-17T13:27:19,111][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2025-03-17T13:27:19,343][INFO ][logstash.inputs.jdbc     ][main][51f36be7356008d684858dda9176542ff1f52380f39c30219fc7011ce8929ea2] (0.028035s) SELECT * FROM tbl_Uniq_ID
[2025-03-17T13:27:19,446][INFO ][logstash.outputs.elasticsearch][main][8ffd74c04d44c0613bc4f11baf81024ade6daef6a4fc42de0dd25d18f3387f60] Aborting the batch due to shutdown request while waiting for connections to become live

To check the status of the Elasticsearch server, I get the following response -


C:\>curl -X GET -u elastic:XXXX -k https://localhost:9200/_cluster/health?pretty
{
  "cluster_name" : "elasticsearch",
  "status" : "green",
  "timed_out" : false,
  "number_of_nodes" : 2,
  "number_of_data_nodes" : 2,
  "active_primary_shards" : 33,
  "active_shards" : 66,
  "relocating_shards" : 0,
  "initializing_shards" : 0,
  "unassigned_shards" : 0,
  "unassigned_primary_shards" : 0,
  "delayed_unassigned_shards" : 0,
  "number_of_pending_tasks" : 0,
  "number_of_in_flight_fetch" : 0,
  "task_max_waiting_in_queue_millis" : 0,
  "active_shards_percent_as_number" : 100.0

Your curl command has -k which essentially means don't check anything in the SSL certificate chain, aka "check nothing".

Your most-recently shared logstash config had ssl_verification_mode => "full" which is the opposite of "check nothing", effectively "check everything".

If this is just a POC, a test project, or something private, then make life easier for yourself and go with the "check nothing" options for now. For logstash, that is ssl_verification_mode => none in the elasticsearch output section.
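
To make the comparison concrete (the password and CA path below are placeholders, not real values):

REM "check nothing" - the curl equivalent of ssl_verification_mode => none
curl -k -u elastic:XXXX https://localhost:9200

REM "check everything" - the curl equivalent of ssl_verification_mode => full
curl --cacert C:\path\to\http_ca.crt -u elastic:XXXX https://localhost:9200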

Hi, I have set it to 'none' but got the following error -

[2025-03-17T16:44:05,688][WARN ][logstash.runner          ] NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.
[2025-03-17T16:44:05,704][INFO ][logstash.runner          ] Log4j configuration path used is: C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\log4j2.properties
[2025-03-17T16:44:05,704][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.17.3", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.6+7-LTS on 21.0.6+7-LTS +indy +jit [x86_64-mswin32]"}
[2025-03-17T16:44:05,704][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-03-17T16:44:05,735][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2025-03-17T16:44:05,735][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2025-03-17T16:44:05,766][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2025-03-17T16:44:06,693][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-03-17T16:44:06,935][INFO ][org.reflections.Reflections] Reflections took 88 ms to scan 1 urls, producing 152 keys and 530 values
[2025-03-17T16:44:08,528][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-17T16:44:08,538][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://localhost:9200"]}
[2025-03-17T16:44:08,541][WARN ][logstash.outputs.elasticsearch][main] You have enabled encryption but DISABLED certificate verification, to make sure your data is secure set `ssl_verification_mode => full`
[2025-03-17T16:44:08,639][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2025-03-17T16:44:08,839][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@localhost:9200/"}
[2025-03-17T16:44:08,840][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.17.3) {:es_version=>8}
[2025-03-17T16:44:08,850][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"legacy_test"}
[2025-03-17T16:44:08,850][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2025-03-17T16:44:08,859][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2025-03-17T16:44:08,866][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/config/mssql.conf"], :thread=>"#<Thread:0x22e664ac C:/ElasticStack/logstash-8.17.3-windows-x86_64/logstash-8.17.3/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-17T16:44:09,365][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.5}
[2025-03-17T16:44:09,593][INFO ][logstash.inputs.jdbc     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-17T16:44:09,594][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-03-17T16:44:09,605][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2025-03-17T16:44:09,768][INFO ][logstash.inputs.jdbc     ][main][707798a09c69ffed54618b00dbd3f661692a3224acc8b95c9cf4659d0266a736] (0.025127s) SELECT * FROM tbl_Uniq_ID
[2025-03-17T16:44:10,850][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2025-03-17T16:44:11,115][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2025-03-17T16:44:11,124][INFO ][logstash.runner          ] Logstash shut down.

Config file as per below -

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-D8HR0LS:1433;databaseName=XXXX;encrypt=false;trustServerCertificate=true"
    jdbc_user => "sa"
    jdbc_password => "XXX"
    jdbc_driver_library => "C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\logstash-core\lib\jars\mssql-jdbc-12.8.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM tbl_Uniq_ID"
  }
}

output {
  elasticsearch {
		hosts => ["https://localhost:9200"]
		index => "legacy_test"
		user => "elastic"
		password => "XXXX"
		ssl_certificate_authorities => ["C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\SQL_Cert.cer"]
# 		ssl_enabled  => false
 		ssl_verification_mode  => "none"
  }
}

There is one field I am trying to import from MSSQL to Elasticsearch, but I am still having an issue.

That looks like it executed normally to me. The SQL statement was executed and there is no schedule on the jdbc input, so it shut down after the query was completed and the result set was flushed to the pipeline.
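
If you want the query to run repeatedly instead of once, the jdbc input has a schedule option that takes cron-style syntax; a sketch (the 5-minute interval below is just an example):

input {
  jdbc {
    # ... your existing jdbc_* settings unchanged ...
    schedule => "*/5 * * * *"  # run the statement every 5 minutes
  }
}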


You can add additional outputs (one or both of the two below) to see more clearly what is happening:

output {  
 stdout { codec => rubydebug }
 file { path => "/path/to/some/output-file", codec => rubydebug }
 elasticsearch {
		hosts => ["https://localhost:9200"]
		index => "legacy_test"
		user => "elastic"
		password => "XXXX"
		ssl_certificate_authorities => ["C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\SQL_Cert.cer"]
# 		ssl_enabled  => false
 		ssl_verification_mode  => "none"
  }
}

First of all, thanks for the continuous replies :slight_smile:
Finally solved it using the line below -
stdout { codec => rubydebug }


input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-D8HR0LS:1433;databaseName=XXX;encrypt=false;trustServerCertificate=true"
    jdbc_user => "sa"
    jdbc_password => "XXX"
    jdbc_driver_library => "C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\logstash-core\lib\jars\mssql-jdbc-12.8.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM tbl_Uniq_ID"
  }
}

output {
  stdout { codec => rubydebug }
#  file { path => "/path/to/some/output-file.csv", codec => rubydebug }
  elasticsearch {
		hosts => ["https://localhost:9200"]
		index => "legacy_test"
		user => "elastic"
		password => "XXXXX"
		ssl_certificate_authorities => ["C:\ElasticStack\logstash-8.17.3-windows-x86_64\logstash-8.17.3\config\SQL_Cert.cer"]
# 		ssl_enabled  => false
 		ssl_verification_mode  => "none"
  }
}