Logstash conf error - NoSuchMethodError for PipelineAction::Create

[2020-02-04T12:55:48,340][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
   [2020-02-04T12:55:48,341][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
   [2020-02-04T12:55:48,341][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
   [2020-02-04T12:55:48,341][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
   [2020-02-04T12:55:48,342][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
   [2020-02-04T12:55:48,342][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
   [2020-02-04T12:55:48,342][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
   [2020-02-04T12:55:48,343][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
   [2020-02-04T12:55:48,343][DEBUG][logstash.runner          ] node.uuid: ""
   [2020-02-04T12:55:48,343][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
   [2020-02-04T12:55:48,384][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
   [2020-02-04T12:55:48,400][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
   [2020-02-04T12:55:48,442][DEBUG][logstash.agent           ] Setting up metric collection
   [2020-02-04T12:55:48,509][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
   [2020-02-04T12:55:48,519][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
   [2020-02-04T12:55:48,668][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
   [2020-02-04T12:55:48,793][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
   [2020-02-04T12:55:48,799][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
   [2020-02-04T12:55:48,813][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
   [2020-02-04T12:55:48,823][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
   [2020-02-04T12:55:48,876][DEBUG][logstash.agent           ] Starting agent
   [2020-02-04T12:55:48,929][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["e:/logstash-7.1.1/bin/benchmark.sh", "e:/logstash-7.1.1/bin/cpdump", "e:/logstash-7.1.1/bin/dependencies-report", "e:/logstash-7.1.1/bin/ingest-convert.sh", "e:/logstash-7.1.1/bin/logstash", "e:/logstash-7.1.1/bin/logstash-keystore", "e:/logstash-7.1.1/bin/logstash-keystore.bat", "e:/logstash-7.1.1/bin/logstash-plugin", "e:/logstash-7.1.1/bin/logstash-plugin.bat", "e:/logstash-7.1.1/bin/logstash.bat", "e:/logstash-7.1.1/bin/logstash.lib.sh", "e:/logstash-7.1.1/bin/pqcheck", "e:/logstash-7.1.1/bin/pqrepair", "e:/logstash-7.1.1/bin/ruby", "e:/logstash-7.1.1/bin/setup.bat", "e:/logstash-7.1.1/bin/system-install", "e:/logstash-7.1.1/bin/test.conf"]}
   [2020-02-04T12:55:48,933][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"e:/logstash-7.1.1/bin/ncv.conf"}
   [2020-02-04T12:55:48,980][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
   [2020-02-04T12:55:48,990][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
   [2020-02-04T12:55:49,745][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::NoSuchMethodError", :message=>"com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;", :backtrace=>["org.logstash.plugins.discovery.Reflections.expandSuperTypes(Reflections.java:114)", "org.logstash.plugins.discovery.Reflections.<init>(Reflections.java:36)", "org.logstash.plugins.discovery.Reflections.<init>(Reflections.java:46)", "org.logstash.plugins.discovery.Reflections.<init>(Reflections.java:42)", "org.logstash.plugins.discovery.PluginRegistry.discoverPlugins(PluginRegistry.java:36)", "org.logstash.plugins.discovery.PluginRegistry.<clinit>(PluginRegistry.java:29)", "org.logstash.plugins.PluginLookup.lookup(PluginLookup.java:27)", "org.logstash.plugins.PluginFactoryExt$Plugins.plugin(PluginFactoryExt.java:200)", "org.logstash.plugins.PluginFactoryExt$Plugins.buildInput(PluginFactoryExt.java:117)", "org.logstash.config.ir.CompiledPipeline.lambda$setupInputs$1(CompiledPipeline.java:150)", "java.util.ArrayList.forEach(ArrayList.java:1249)", "org.logstash.config.ir.CompiledPipeline.setupInputs(CompiledPipeline.java:147)", "org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:80)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:50)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1154)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1141)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", 
"E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(E:/logstash-7.1.1/logstash-core/lib/logstash/java_pipeline.rb:23)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:296)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:82)", "org.jruby.RubyClass.newInstance(RubyClass.java:915)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(E:/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb:36)", "E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(E:/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:325)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:77)", "org.jruby.runtime.Block.call(Block.java:124)", "org.jruby.RubyProc.call(RubyProc.java:295)", "org.jruby.RubyProc.call(RubyProc.java:274)", "org.jruby.RubyProc.call(RubyProc.java:270)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:745)"]}warning: thread "Converge 
PipelineAction::Create<main>" terminated with exception (report_on_exception is true):
LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`
          create at org/logstash/execution/ConvergeResultExt.java:109
             add at org/logstash/execution/ConvergeResultExt.java:37
  converge_state at E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:338
   [2020-02-04T12:55:49,764][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:338:in `block in converge_state'"]}

   [2020-02-04T12:55:49,789][DEBUG][logstash.agent           ] Starting puma
   [2020-02-04T12:55:49,800][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
   [2020-02-04T12:55:49,806][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:338:in `block in converge_state'"]}
   [2020-02-04T12:55:49,866][DEBUG][logstash.api.service     ] [api-service] start
   [2020-02-04T12:55:49,881][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

   E:\logstash-7.1.1\bin>

@c95mbq - I believe you stripped down the debug logs and removed the JDBC input configuration lines; can you confirm that?

This error is a bit cryptic and I am starting to wonder whether the problem is related to the JDBC input configuration or something else.

Could you perform the following tests? Hopefully they will help narrow down the problem.

Test 1

Create a simple config file (e.g. simple.conf) with the configuration below and run Logstash against it (e.g. logstash --debug -f e:\logstash-7.1.1\bin\simple.conf):

input {
  stdin { }
}

output {
  stdout { }
}

Any errors encountered?

Test 2

Replace the SQL statement in ncv.conf with a simpler one (e.g. SELECT * FROM TABLE_NAME) and run Logstash with the same command. Any errors encountered?
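For reference, a stripped-down ncv.conf along those lines might look like the sketch below. Every value here is a placeholder (the driver jar path, driver class, connection string, and credentials are illustrative assumptions only; keep your own settings):

```conf
input {
  jdbc {
    # Placeholders - substitute your own driver jar, connection details and credentials
    jdbc_driver_library => "e:/drivers/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//dbhost:1521/service"
    jdbc_user => "user"
    jdbc_password => "changeme"
    statement => "SELECT * FROM TABLE_NAME"
  }
}

output {
  # Print events to the console so we only test the input side
  stdout { codec => rubydebug }
}
```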

Test 3

Download the latest Logstash version and use the ncv.conf from the previous comment. Any errors encountered?
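One more thing worth ruling out: a NoSuchMethodError on com.google.common.collect.Sets$SetView.iterator() is usually a symptom of a Guava version conflict, e.g. an older guava jar being picked up from a machine-wide CLASSPATH environment variable and shadowing the Guava that Logstash bundles under logstash-core/lib/jars. This is only a hunch for your setup, but it is a quick check:

```shell
# A stray jar on a system-wide CLASSPATH can shadow the Guava version that
# Logstash ships with and trigger exactly this kind of NoSuchMethodError.
# Print the variable to see whether anything is set
# (on Windows cmd the equivalent is: echo %CLASSPATH%):
echo "CLASSPATH=${CLASSPATH:-<not set>}"
```

If it is set and points at third-party jars, try unsetting it for the session and starting Logstash again.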

Thank you.

Thanks ropc for persisting!

Below is the error message received after creating and running the simple.conf file as per your instructions above; I will post back with tests 2 and 3 shortly.

(broken up into two parts)

E:\logstash-7.1.1\bin>logstash --debug -f e:\logstash-7.1.1\bin\simple.conf
Sending Logstash logs to E:/logstash-7.1.1/logs which is now configured via log4j2.properties
[2020-02-04T14:16:30,912][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"E:/logstash-7.1.1/modules/fb_apache/configuration"}
[2020-02-04T14:16:30,919][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x4b558f08 @directory="E:/logstash-7.1.1/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-04T14:16:30,922][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"E:/logstash-7.1.1/modules/netflow/configuration"}
[2020-02-04T14:16:30,923][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x221032e7 @directory="E:/logstash-7.1.1/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-04T14:16:31,028][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2020-02-04T14:16:31,029][DEBUG][logstash.runner          ] node.name: "RPA5874"
[2020-02-04T14:16:31,030][DEBUG][logstash.runner          ] *path.config: "e:\\logstash-7.1.1\\bin\\simple.conf"
[2020-02-04T14:16:31,030][DEBUG][logstash.runner          ] path.data: "E:/logstash-7.1.1/data"
[2020-02-04T14:16:31,031][DEBUG][logstash.runner          ] modules.cli: []
[2020-02-04T14:16:31,032][DEBUG][logstash.runner          ] modules: []
[2020-02-04T14:16:31,032][DEBUG][logstash.runner          ] modules_list: []
[2020-02-04T14:16:31,033][DEBUG][logstash.runner          ] modules_variable_list: []
[2020-02-04T14:16:31,033][DEBUG][logstash.runner          ] modules_setup: false
[2020-02-04T14:16:31,034][DEBUG][logstash.runner          ] config.test_and_exit: false
[2020-02-04T14:16:31,035][DEBUG][logstash.runner          ] config.reload.automatic: false
[2020-02-04T14:16:31,035][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2020-02-04T14:16:31,036][DEBUG][logstash.runner          ] config.support_escapes: false
[2020-02-04T14:16:31,037][DEBUG][logstash.runner          ] config.field_reference.parser: "STRICT"
[2020-02-04T14:16:31,037][DEBUG][logstash.runner          ] metric.collect: true
[2020-02-04T14:16:31,038][DEBUG][logstash.runner          ] pipeline.id: "main"
[2020-02-04T14:16:31,038][DEBUG][logstash.runner          ] pipeline.system: false
[2020-02-04T14:16:31,039][DEBUG][logstash.runner          ] pipeline.workers: 16
[2020-02-04T14:16:31,040][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2020-02-04T14:16:31,040][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2020-02-04T14:16:31,041][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2020-02-04T14:16:31,042][DEBUG][logstash.runner          ] pipeline.java_execution: true
[2020-02-04T14:16:31,042][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2020-02-04T14:16:31,043][DEBUG][logstash.runner          ] path.plugins: []
[2020-02-04T14:16:31,043][DEBUG][logstash.runner          ] config.debug: false
[2020-02-04T14:16:31,044][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2020-02-04T14:16:31,045][DEBUG][logstash.runner          ] version: false
[2020-02-04T14:16:31,045][DEBUG][logstash.runner          ] help: false
[2020-02-04T14:16:31,046][DEBUG][logstash.runner          ] log.format: "plain"
[2020-02-04T14:16:31,046][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2020-02-04T14:16:31,047][DEBUG][logstash.runner          ] http.port: 9600..9700
[2020-02-04T14:16:31,048][DEBUG][logstash.runner          ] http.environment: "production"
[2020-02-04T14:16:31,048][DEBUG][logstash.runner          ] queue.type: "memory"

[2020-02-04T14:16:31,049][DEBUG][logstash.runner          ] queue.drain: false
[2020-02-04T14:16:31,049][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2020-02-04T14:16:31,050][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2020-02-04T14:16:31,051][DEBUG][logstash.runner          ] queue.max_events: 0
[2020-02-04T14:16:31,051][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2020-02-04T14:16:31,052][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2020-02-04T14:16:31,053][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2020-02-04T14:16:31,053][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2020-02-04T14:16:31,054][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2020-02-04T14:16:31,054][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2020-02-04T14:16:31,055][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2020-02-04T14:16:31,056][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2020-02-04T14:16:31,056][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2020-02-04T14:16:31,057][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2020-02-04T14:16:31,058][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2020-02-04T14:16:31,058][DEBUG][logstash.runner          ] keystore.file: "E:/logstash-7.1.1/config/logstash.keystore"
[2020-02-04T14:16:31,059][DEBUG][logstash.runner          ] path.queue: "E:/logstash-7.1.1/data/queue"
[2020-02-04T14:16:31,059][DEBUG][logstash.runner          ] path.dead_letter_queue: "E:/logstash-7.1.1/data/dead_letter_queue"
[2020-02-04T14:16:31,060][DEBUG][logstash.runner          ] path.settings: "E:/logstash-7.1.1/config"
[2020-02-04T14:16:31,061][DEBUG][logstash.runner          ] path.logs: "E:/logstash-7.1.1/logs"
[2020-02-04T14:16:31,062][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2020-02-04T14:16:31,063][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2020-02-04T14:16:31,064][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2020-02-04T14:16:31,067][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2020-02-04T14:16:31,068][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2020-02-04T14:16:31,069][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-04T14:16:31,070][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2020-02-04T14:16:31,071][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2020-02-04T14:16:31,072][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2020-02-04T14:16:31,073][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2020-02-04T14:16:31,074][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2020-02-04T14:16:31,075][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2020-02-04T14:16:31,076][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-04T14:16:31,077][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2020-02-04T14:16:31,078][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2020-02-04T14:16:31,079][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2020-02-04T14:16:31,080][DEBUG][logstash.runner          ] node.uuid: ""
[2020-02-04T14:16:31,081][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2020-02-04T14:16:31,126][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-04T14:16:31,142][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2020-02-04T14:16:31,185][DEBUG][logstash.agent           ] Setting up metric collection
[2020-02-04T14:16:31,251][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:16:31,261][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-02-04T14:16:31,408][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:16:31,527][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-02-04T14:16:31,533][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-02-04T14:16:31,547][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:16:31,557][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:16:31,607][DEBUG][logstash.agent           ] Starting agent
[2020-02-04T14:16:31,662][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["e:/logstash-7.1.1/bin/benchmark.sh", "e:/logstash-7.1.1/bin/cpdump", "e:/logstash-7.1.1/bin/dependencies-report", "e:/logstash-7.1.1/bin/ingest-convert.sh", "e:/logstash-7.1.1/bin/logstash", "e:/logstash-7.1.1/bin/logstash-keystore", "e:/logstash-7.1.1/bin/logstash-keystore.bat", "e:/logstash-7.1.1/bin/logstash-plugin", "e:/logstash-7.1.1/bin/logstash-plugin.bat", "e:/logstash-7.1.1/bin/logstash.bat", "e:/logstash-7.1.1/bin/logstash.lib.sh", "e:/logstash-7.1.1/bin/ncv.conf", "e:/logstash-7.1.1/bin/pqcheck", "e:/logstash-7.1.1/bin/pqrepair", "e:/logstash-7.1.1/bin/ruby", "e:/logstash-7.1.1/bin/setup.bat", "e:/logstash-7.1.1/bin/system-install", "e:/logstash-7.1.1/bin/test.conf"]}
[2020-02-04T14:16:31,666][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"e:/logstash-7.1.1/bin/simple.conf"}
[2020-02-04T14:16:31,713][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2020-02-04T14:16:31,722][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}

Part two of the error message:

[2020-02-04T14:16:32,226][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::NoSuchMethodError", :message=>"com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;", :backtrace=>["org.logstash.plugins.discovery.Reflections.expandSuperTypes(Reflections.java:114)", "org.logstash.plugins.discovery.Reflections.<init>(Reflections.java:36)", "org.logstash.plugins.discovery.Reflections.<init>(Reflections.java:46)", "org.logstash.plugins.discovery.Reflections.<init>(Reflections.java:42)", "org.logstash.plugins.discovery.PluginRegistry.discoverPlugins(PluginRegistry.java:36)", "org.logstash.plugins.discovery.PluginRegistry.<clinit>(PluginRegistry.java:29)", "org.logstash.plugins.PluginLookup.lookup(PluginLookup.java:27)", "org.logstash.plugins.PluginFactoryExt$Plugins.plugin(PluginFactoryExt.java:200)", "org.logstash.plugins.PluginFactoryExt$Plugins.buildInput(PluginFactoryExt.java:117)", "org.logstash.config.ir.CompiledPipeline.lambda$setupInputs$1(CompiledPipeline.java:150)", "java.util.ArrayList.forEach(ArrayList.java:1249)", "org.logstash.config.ir.CompiledPipeline.setupInputs(CompiledPipeline.java:147)", "org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:80)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:50)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1154)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1141)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", 
"E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(E:/logstash-7.1.1/logstash-core/lib/logstash/java_pipeline.rb:23)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:296)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:82)", "org.jruby.RubyClass.newInstance(RubyClass.java:915)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(E:/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb:36)", "E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(E:/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)","org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_1_dot_1.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:325)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:77)", "org.jruby.runtime.Block.call(Block.java:124)", "org.jruby.RubyProc.call(RubyProc.java:295)", "org.jruby.RubyProc.call(RubyProc.java:274)", "org.jruby.RubyProc.call(RubyProc.java:270)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:745)"]}
warning: thread "Converge PipelineAction::Create<main>" terminated with exception (report_on_exception is true):
LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`
          create at org/logstash/execution/ConvergeResultExt.java:109
             add at org/logstash/execution/ConvergeResultExt.java:37
  converge_state at E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:338
[2020-02-04T14:16:32,245][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:338:in `block in converge_state'"]}

[2020-02-04T14:16:32,276][DEBUG][logstash.agent           ] Starting puma
[2020-02-04T14:16:32,290][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-02-04T14:16:32,292][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:338:in `block in converge_state'"]}
[2020-02-04T14:16:32,361][DEBUG][logstash.api.service     ] [api-service] start
[2020-02-04T14:16:32,374][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

E:\logstash-7.1.1\bin>

Test 3, using the original ncv.conf file with the SELECT statement simplified to:

select * from v$database

Downloaded logstash-7.5.2

Same error as before, split over two replies; first part:

E:\logstash-7.5.2\bin>logstash -f e:\logstash-7.5.2\bin\ncv.conf --debug
Thread.exclusive is deprecated, use Thread::Mutex
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/E:/tika-app-1.20.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/E:/logstash-7.5.2/logstash-core/lib/jars/log4j-slf4j-impl-2.11.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Sending Logstash logs to E:/logstash-7.5.2/logs which is now configured via log4j2.properties
[2020-02-04T14:34:36,307][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"E:/logstash-7.5.2/modules/fb_apache/configuration"}
[2020-02-04T14:34:36,432][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x218c6b23 @directory="E:/logstash-7.5.2/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-04T14:34:36,437][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"E:/logstash-7.5.2/modules/netflow/configuration"}
[2020-02-04T14:34:36,439][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x5b6f8b25 @directory="E:/logstash-7.5.2/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-04T14:34:36,493][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"E:/logstash-7.5.2/data/queue"}
[2020-02-04T14:34:36,512][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"E:/logstash-7.5.2/data/dead_letter_queue"}
[2020-02-04T14:34:36,580][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2020-02-04T14:34:36,583][DEBUG][logstash.runner          ] node.name: "RPA5874"

[2020-02-04T14:34:36,585][DEBUG][logstash.runner          ] *path.config: "e:\\logstash-7.5.2\\bin\\ncv.conf"
[2020-02-04T14:34:36,587][DEBUG][logstash.runner          ] path.data: "E:/logstash-7.5.2/data"
[2020-02-04T14:34:36,589][DEBUG][logstash.runner          ] modules.cli: []
[2020-02-04T14:34:36,590][DEBUG][logstash.runner          ] modules: []
[2020-02-04T14:34:36,594][DEBUG][logstash.runner          ] modules_list: []
[2020-02-04T14:34:36,596][DEBUG][logstash.runner          ] modules_variable_list: []
[2020-02-04T14:34:36,597][DEBUG][logstash.runner          ] modules_setup: false

[2020-02-04T14:34:36,599][DEBUG][logstash.runner          ] config.test_and_exit: false
[2020-02-04T14:34:36,601][DEBUG][logstash.runner          ] config.reload.automatic: false
[2020-02-04T14:34:36,602][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2020-02-04T14:34:36,604][DEBUG][logstash.runner          ] config.support_escapes: false
[2020-02-04T14:34:36,606][DEBUG][logstash.runner          ] config.field_reference.parser: "STRICT"
[2020-02-04T14:34:36,608][DEBUG][logstash.runner          ] metric.collect: true

[2020-02-04T14:34:36,610][DEBUG][logstash.runner          ] pipeline.id: "main"
[2020-02-04T14:34:36,611][DEBUG][logstash.runner          ] pipeline.system: false
[2020-02-04T14:34:36,613][DEBUG][logstash.runner          ] pipeline.workers: 16

[2020-02-04T14:34:36,614][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2020-02-04T14:34:36,616][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2020-02-04T14:34:36,618][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2020-02-04T14:34:36,619][DEBUG][logstash.runner          ] pipeline.java_execution: true
[2020-02-04T14:34:36,621][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2020-02-04T14:34:36,622][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2020-02-04T14:34:36,624][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2020-02-04T14:34:36,625][DEBUG][logstash.runner          ] path.plugins: []
[2020-02-04T14:34:36,626][DEBUG][logstash.runner          ] config.debug: false
[2020-02-04T14:34:36,628][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2020-02-04T14:34:36,630][DEBUG][logstash.runner          ] version: false
[2020-02-04T14:34:36,632][DEBUG][logstash.runner          ] help: false
[2020-02-04T14:34:36,635][DEBUG][logstash.runner          ] log.format: "plain"
[2020-02-04T14:34:36,636][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2020-02-04T14:34:36,638][DEBUG][logstash.runner          ] http.port: 9600..9700
[2020-02-04T14:34:36,639][DEBUG][logstash.runner          ] http.environment: "production"
[2020-02-04T14:34:36,642][DEBUG][logstash.runner          ] queue.type: "memory"

[2020-02-04T14:34:36,643][DEBUG][logstash.runner          ] queue.drain: false
[2020-02-04T14:34:36,645][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2020-02-04T14:34:36,647][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2020-02-04T14:34:36,648][DEBUG][logstash.runner          ] queue.max_events: 0
[2020-02-04T14:34:36,650][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2020-02-04T14:34:36,652][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2020-02-04T14:34:36,653][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2020-02-04T14:34:36,655][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2020-02-04T14:34:36,658][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2020-02-04T14:34:36,662][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2020-02-04T14:34:36,664][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2020-02-04T14:34:36,665][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2020-02-04T14:34:36,666][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2020-02-04T14:34:36,667][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2020-02-04T14:34:36,668][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2020-02-04T14:34:36,669][DEBUG][logstash.runner          ] keystore.file: "E:/logstash-7.5.2/config/logstash.keystore"
[2020-02-04T14:34:36,671][DEBUG][logstash.runner          ] path.queue: "E:/logstash-7.5.2/data/queue"
[2020-02-04T14:34:36,672][DEBUG][logstash.runner          ] path.dead_letter_queue: "E:/logstash-7.5.2/data/dead_letter_queue"
[2020-02-04T14:34:36,673][DEBUG][logstash.runner          ] path.settings: "E:/logstash-7.5.2/config"
[2020-02-04T14:34:36,675][DEBUG][logstash.runner          ] path.logs: "E:/logstash-7.5.2/logs"
[2020-02-04T14:34:36,676][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2020-02-04T14:34:36,678][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2020-02-04T14:34:36,679][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2020-02-04T14:34:36,680][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2020-02-04T14:34:36,681][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2020-02-04T14:34:36,682][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-04T14:34:36,683][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2020-02-04T14:34:36,684][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2020-02-04T14:34:36,685][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2020-02-04T14:34:36,686][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2020-02-04T14:34:36,688][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2020-02-04T14:34:36,689][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2020-02-04T14:34:36,690][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-04T14:34:36,691][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2020-02-04T14:34:36,692][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2020-02-04T14:34:36,693][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2020-02-04T14:34:36,694][DEBUG][logstash.runner          ] node.uuid: ""

second part of the log file

[2020-02-04T14:34:36,695][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2020-02-04T14:34:36,752][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-04T14:34:36,770][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.5.2"}
[2020-02-04T14:34:36,802][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"4bd72430-4caf-4739-a075-b1567c1bc236", :path=>"E:/logstash-7.5.2/data/uuid"}
[2020-02-04T14:34:36,829][DEBUG][logstash.agent           ] Setting up metric collection
[2020-02-04T14:34:36,896][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:34:36,907][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-02-04T14:34:37,058][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:34:37,176][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-02-04T14:34:37,182][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-02-04T14:34:37,199][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:34:37,211][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-04T14:34:37,268][DEBUG][logstash.agent           ] Starting agent
[2020-02-04T14:34:37,329][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["e:/logstash-7.5.2/bin/benchmark.sh", "e:/logstash-7.5.2/bin/cpdump", "e:/logstash-7.5.2/bin/dependencies-report", "e:/logstash-7.5.2/bin/ingest-convert.sh", "e:/logstash-7.5.2/bin/logstash", "e:/logstash-7.5.2/bin/logstash-keystore", "e:/logstash-7.5.2/bin/logstash-keystore.bat", "e:/logstash-7.5.2/bin/logstash-plugin", "e:/logstash-7.5.2/bin/logstash-plugin.bat", "e:/logstash-7.5.2/bin/logstash.bat", "e:/logstash-7.5.2/bin/logstash.lib.sh", "e:/logstash-7.5.2/bin/pqcheck", "e:/logstash-7.5.2/bin/pqrepair", "e:/logstash-7.5.2/bin/ruby", "e:/logstash-7.5.2/bin/setup.bat", "e:/logstash-7.5.2/bin/system-install"]}
[2020-02-04T14:34:37,335][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"e:/logstash-7.5.2/bin/ncv.conf"}
[2020-02-04T14:34:37,383][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2020-02-04T14:34:37,393][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2020-02-04T14:34:38,097][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
INFO  Reflections took 43 ms to scan 1 urls, producing 20 keys and 40 values
[2020-02-04T14:34:38,816][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"jdbc", :type=>"input", :class=>LogStash::Inputs::Jdbc}
[2020-02-04T14:34:39,046][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2020-02-04T14:34:39,077][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_0da03879-6ba6-41a9-91df-89dc88a4fcae"
[2020-02-04T14:34:39,079][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2020-02-04T14:34:39,080][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2020-02-04T14:34:39,106][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_user = "username"
[2020-02-04T14:34:39,107][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_paging_enabled = true
[2020-02-04T14:34:39,109][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validate_connection = true
[2020-02-04T14:34:39,112][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_password = <password>
[2020-02-04T14:34:39,114][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_page_size = 100000
[2020-02-04T14:34:39,115][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@statement = "select * from v$database"
[2020-02-04T14:34:39,116][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_library = "E:\\ojdbc6.jar"
[2020-02-04T14:34:39,118][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_connection_string = "jdbc:oracle:thin:@databasenode.domain.net:1521/scert.WORLD"
[2020-02-04T14:34:39,119][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@id = "f675b2ac2fecb1326be569fed3b9e57cc1e2eabfc7ebb0a11266aed4210cf9d1"
[2020-02-04T14:34:39,121][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_class = "Java::oracle.jdbc.driver.OracleDriver"
[2020-02-04T14:34:39,123][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@enable_metric = true
[2020-02-04T14:34:39,140][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@codec = <LogStash::Codecs::Plain id=>"plain_0da03879-6ba6-41a9-91df-89dc88a4fcae", enable_metric=>true, charset=>"UTF-8">
[2020-02-04T14:34:39,142][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@add_field = {}
[2020-02-04T14:34:39,144][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validation_timeout = 3600
[2020-02-04T14:34:39,145][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_pool_timeout = 5
[2020-02-04T14:34:39,146][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sequel_opts = {}
[2020-02-04T14:34:39,147][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sql_log_level = "info"
[2020-02-04T14:34:39,148][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts = 1
[2020-02-04T14:34:39,149][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts_wait_time = 0.5
[2020-02-04T14:34:39,150][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@plugin_timezone = "utc"
[2020-02-04T14:34:39,151][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@parameters = {}
[2020-02-04T14:34:39,158][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@last_run_metadata_path = "C:\\Users\\isdsupml/.logstash_jdbc_last_run"
[2020-02-04T14:34:39,160][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@use_column_value = false
[2020-02-04T14:34:39,161][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@tracking_column_type = "numeric"
[2020-02-04T14:34:39,162][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@clean_run = false
[2020-02-04T14:34:39,164][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@record_last_run = true
[2020-02-04T14:34:39,165][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@lowercase_column_names = true
[2020-02-04T14:34:39,167][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@columns_charset = {}
[2020-02-04T14:34:39,169][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@use_prepared_statements = false
[2020-02-04T14:34:39,170][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@prepared_statement_name = ""
[2020-02-04T14:34:39,171][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@prepared_statement_bind_values = []
[2020-02-04T14:34:39,215][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2020-02-04T14:34:39,239][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2020-02-04T14:34:39,249][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_2b15b585-ef2c-43f9-8c20-c7900047c773"
[2020-02-04T14:34:39,250][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2020-02-04T14:34:39,251][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2020-02-04T14:34:40,884][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "7d86cb147d036c9db789af888f1aaea57f64ecc122e15ae7efeba04764cdb7b7"
[2020-02-04T14:34:40,886][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2020-02-04T14:34:40,889][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_2b15b585-ef2c-43f9-8c20-c7900047c773", enable_metric=>true, metadata=>false>
[2020-02-04T14:34:40,890][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1

third part of the log file

[2020-02-04T14:34:41,152][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::NoSuchMethodError", :message=>"com.google.common.collect.TreeRangeMap.asDescendingMapOfRanges()Ljava/util/Map;", :backtrace=>["com.google.googlejavaformat.java.ModifierOrderer.applyReplacements(ModifierOrderer.java:161)", "com.google.googlejavaformat.java.ModifierOrderer.reorderModifiers(ModifierOrderer.java:139)", "com.google.googlejavaformat.java.Formatter.getFormatReplacements(Formatter.java:180)", "com.google.googlejavaformat.java.Formatter.formatSource(Formatter.java:163)", "com.google.googlejavaformat.java.Formatter.formatSource(Formatter.java:149)", "org.logstash.config.ir.compiler.ComputeStepSyntaxElement.generateCode(ComputeStepSyntaxElement.java:118)", "org.logstash.config.ir.compiler.ComputeStepSyntaxElement.compile(ComputeStepSyntaxElement.java:85)", "org.logstash.config.ir.CompiledPipeline.lambda$getDatasetClass$2(CompiledPipeline.java:304)", "java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)", "org.logstash.config.ir.CompiledPipeline.getDatasetClass(CompiledPipeline.java:304)", "org.logstash.config.ir.CompiledPipeline.access$200(CompiledPipeline.java:50)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.outputDataset(CompiledPipeline.java:417)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.lambda$compile$1(CompiledPipeline.java:347)", "java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)", "java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)", "java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)", "java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)", "java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)", "java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)", 
"java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.compile(CompiledPipeline.java:348)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.<init>(CompiledPipeline.java:328)", "org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:117)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:60)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1156)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1143)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(E:/logstash-7.5.2/logstash-core/lib/logstash/java_pipeline.rb:27)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)","org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)","org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)","org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:915)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(E:/logstash-7.5.2/logstash-core/lib/logstash/pipeline_action/create.rb:36)", 
"E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(E:/logstash-7.5.2/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:326)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:77)", "org.jruby.runtime.Block.call(Block.java:129)", "org.jruby.RubyProc.call(RubyProc.java:295)", "org.jruby.RubyProc.call(RubyProc.java:274)", "org.jruby.RubyProc.call(RubyProc.java:270)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:745)"]}warning: thread "Converge PipelineAction::Create<main>" terminated with exception (report_on_exception is true):LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`          create at org/logstash/execution/ConvergeResultExt.java:109             add at org/logstash/execution/ConvergeResultExt.java:37  converge_state at E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339
[2020-02-04T14:34:41,168][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339:in `block in converge_state'"]}

[2020-02-04T14:34:41,190][DEBUG][logstash.agent           ] Starting puma
[2020-02-04T14:34:41,195][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339:in `block in converge_state'"]}
[2020-02-04T14:34:41,208][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-02-04T14:34:41,255][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

E:\logstash-7.5.2\bin>

Thanks for all your help with this. I haven't used Logstash since v5, and that version just worked without any issues, reading both from files and directly from a database using JDBC.

Hi @c95mbq, thank you for running the above tests. I tried to reproduce the issue in the lab and discussed it with the team.

The error reported in your comment happens during pipeline compilation, before the plugins are instantiated (like the earlier error). One suspicion is that a replacement fails to be applied when compiling the pipeline with the JDBC plugin: the SQL query is select * from v$database, and $database is not given a value anywhere.
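If that suspicion is right, the general failure mode can be illustrated with shell-style $name templating. This is a hedged sketch using Python's string.Template, not Logstash's actual substitution code: a bare $database token either errors out when no value is bound, or silently rewrites the literal Oracle view name v$database when one is.

```python
from string import Template

# Hypothetical illustration of $-style substitution applied to the SQL text;
# not Logstash's implementation, just the two ways a bare $database can go wrong.
sql = Template("select * from v$database")

# Strict substitution with no value bound fails outright:
try:
    sql.substitute({})
except KeyError as err:
    print("unresolved variable:", err)  # -> unresolved variable: 'database'

# Binding a value silently mangles the literal view name:
print(sql.substitute({"database": "dns.to.db"}))  # -> select * from vdns.to.db
```

Either outcome would explain why the query behaves differently from a plain copy of the SQL run directly against the database.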

Could you run two new tests with Logstash 7.5.2:

1. Test 1

Create a simple config file (e.g. simple.conf) with the configuration below:

input {
  stdin { }
}

output {
  stdout { }
}

Run Logstash with the following command: bin/logstash --debug -f e:\logstash-7.5.2\bin\simple.conf and share the debug logs with us.

2. Test 2

Run Logstash with the JDBC config file, but pass a value for $database. For example:

  • export database=dns.to.db && bin/logstash --debug -f <path_to_conf_file>
  • Share the debug logs with us, as well as the config file used for the test (i.e. you can strip out any credentials, but keep the rest of the configuration unchanged).
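One caveat for this environment: the prompt in the logs is Windows cmd.exe, where POSIX export is unavailable; there you would use "set database=dns.to.db" before running logstash.bat. As a shell-agnostic alternative, the variable can be injected into the environment the Logstash process inherits. A hedged sketch follows; the paths are the ones seen elsewhere in this thread, and the actual launch is commented out:

```python
import os
import subprocess

# Build a child environment with the variable set, without touching the
# parent shell's environment.
env = dict(os.environ, database="dns.to.db")
print(env["database"])  # -> dns.to.db

# Launch Logstash with that environment (commented out; adjust paths):
# subprocess.run(
#     [r"e:\logstash-7.5.2\bin\logstash.bat", "--debug", "-f",
#      r"e:\logstash-7.5.2\bin\ncv.conf"],
#     env=env,
# )
```

This avoids any dependence on whether the shell is cmd.exe, PowerShell, or bash.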

Thank you for your collaboration.

Thanks again, ropc, for all your help with this; much appreciated.

Below is the first part of the log file for test 1, using the simple.conf definition you provided above.

E:\logstash-7.5.2\bin>logstash --debug -f e:\logstash-7.5.2\bin\simple.conf
Thread.exclusive is deprecated, use Thread::Mutex
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/E:/tika-app-1.20.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/E:/logstash-7.5.2/logstash-core/lib/jars/log4j-slf4j-impl-2.11.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Sending Logstash logs to E:/logstash-7.5.2/logs which is now configured via log4j2.properties
[2020-02-05T07:12:40,515][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"E:/logstash-7.5.2/modules/fb_apache/configuration"}
[2020-02-05T07:12:40,653][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x63cbb54f @directory="E:/logstash-7.5.2/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-05T07:12:40,658][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"E:/logstash-7.5.2/modules/netflow/configuration"}
[2020-02-05T07:12:40,660][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaff
old:0x6e067cc9 @directory="E:/logstash-7.5.2/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-05T07:12:40,782][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2020-02-05T07:12:40,784][DEBUG][logstash.runner          ] node.name: "RPA5874"
[2020-02-05T07:12:40,786][DEBUG][logstash.runner          ] *path.config: "e:\\logstash-7.5.2\\bin\\simple.conf"
[2020-02-05T07:12:40,788][DEBUG][logstash.runner          ] path.data: "E:/logstash-7.5.2/data"
[2020-02-05T07:12:40,789][DEBUG][logstash.runner          ] modules.cli: []
[2020-02-05T07:12:40,791][DEBUG][logstash.runner          ] modules: []
[2020-02-05T07:12:40,793][DEBUG][logstash.runner          ] modules_list: []
[2020-02-05T07:12:40,794][DEBUG][logstash.runner          ] modules_variable_list: []
[2020-02-05T07:12:40,798][DEBUG][logstash.runner          ] modules_setup: false
[2020-02-05T07:12:40,800][DEBUG][logstash.runner          ] config.test_and_exit: false
[2020-02-05T07:12:40,801][DEBUG][logstash.runner          ] config.reload.automatic: false
[2020-02-05T07:12:40,803][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2020-02-05T07:12:40,804][DEBUG][logstash.runner          ] config.support_escapes: false
[2020-02-05T07:12:40,806][DEBUG][logstash.runner          ] config.field_reference.parser: "STRICT"
[2020-02-05T07:12:40,807][DEBUG][logstash.runner          ] metric.collect: true

[2020-02-05T07:12:40,809][DEBUG][logstash.runner          ] pipeline.id: "main"
[2020-02-05T07:12:40,810][DEBUG][logstash.runner          ] pipeline.system: false
[2020-02-05T07:12:40,812][DEBUG][logstash.runner          ] pipeline.workers: 16

[2020-02-05T07:12:40,813][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2020-02-05T07:12:40,815][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2020-02-05T07:12:40,816][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2020-02-05T07:12:40,818][DEBUG][logstash.runner          ] pipeline.java_execution: true
[2020-02-05T07:12:40,820][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2020-02-05T07:12:40,821][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2020-02-05T07:12:40,822][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2020-02-05T07:12:40,824][DEBUG][logstash.runner          ] path.plugins: []
[2020-02-05T07:12:40,825][DEBUG][logstash.runner          ] config.debug: false
[2020-02-05T07:12:40,826][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2020-02-05T07:12:40,828][DEBUG][logstash.runner          ] version: false
[2020-02-05T07:12:40,829][DEBUG][logstash.runner          ] help: false
[2020-02-05T07:12:40,831][DEBUG][logstash.runner          ] log.format: "plain"
[2020-02-05T07:12:40,832][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2020-02-05T07:12:40,834][DEBUG][logstash.runner          ] http.port: 9600..9700
[2020-02-05T07:12:40,835][DEBUG][logstash.runner          ] http.environment: "production"
[2020-02-05T07:12:40,836][DEBUG][logstash.runner          ] queue.type: "memory"

[2020-02-05T07:12:40,838][DEBUG][logstash.runner          ] queue.drain: false
[2020-02-05T07:12:40,841][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2020-02-05T07:12:40,842][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2020-02-05T07:12:40,844][DEBUG][logstash.runner          ] queue.max_events: 0
[2020-02-05T07:12:40,845][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2020-02-05T07:12:40,846][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2020-02-05T07:12:40,847][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2020-02-05T07:12:40,848][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2020-02-05T07:12:40,850][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2020-02-05T07:12:40,851][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2020-02-05T07:12:40,853][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2020-02-05T07:12:40,854][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2020-02-05T07:12:40,855][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2020-02-05T07:12:40,856][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2020-02-05T07:12:40,857][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2020-02-05T07:12:40,859][DEBUG][logstash.runner          ] keystore.file: "E:/logstash-7.5.2/config/logstash.keystore"
[2020-02-05T07:12:40,860][DEBUG][logstash.runner          ] path.queue: "E:/logstash-7.5.2/data/queue"
[2020-02-05T07:12:40,861][DEBUG][logstash.runner          ] path.dead_letter_queue: "E:/logstash-7.5.2/data/dead_letter_queue"
[2020-02-05T07:12:40,862][DEBUG][logstash.runner          ] path.settings: "E:/logstash-7.5.2/config"
[2020-02-05T07:12:40,863][DEBUG][logstash.runner          ] path.logs: "E:/logstash-7.5.2/logs"
[2020-02-05T07:12:40,864][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2020-02-05T07:12:40,866][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2020-02-05T07:12:40,867][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2020-02-05T07:12:40,868][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2020-02-05T07:12:40,870][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2020-02-05T07:12:40,871][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-05T07:12:40,872][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2020-02-05T07:12:40,873][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2020-02-05T07:12:40,874][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2020-02-05T07:12:40,875][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2020-02-05T07:12:40,876][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2020-02-05T07:12:40,877][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2020-02-05T07:12:40,878][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-05T07:12:40,879][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2020-02-05T07:12:40,880][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2020-02-05T07:12:40,881][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2020-02-05T07:12:40,882][DEBUG][logstash.runner          ] node.uuid: ""
[2020-02-05T07:12:40,883][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2020-02-05T07:12:40,933][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-05T07:12:40,949][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.5.2"}
[2020-02-05T07:12:41,007][DEBUG][logstash.agent           ] Setting up metric collection
[2020-02-05T07:12:41,076][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T07:12:41,088][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-02-05T07:12:41,244][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T07:12:41,372][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-02-05T07:12:41,379][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-02-05T07:12:41,397][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T07:12:41,408][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T07:12:41,469][DEBUG][logstash.agent           ] Starting agent
[2020-02-05T07:12:41,533][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["e:/logstash-7.5.2/bin/benchmark.sh", "e:/logstash-7.5.2/bin/cpdump", "e:/logstash-7.5.2/bin/dependencies-report", "e:/logstash-7.5.2/bin/ingest-convert.sh", "e:/logstash-7.5.2/bin/logstash", "e:/logstash-7.5.2/bin/logstash-keystore", "e:/logstash-7.5.2/bin/logstash-keystore.bat", "e:/logstash-7.5.2/bin/logstash-plugin", "e:/logstash-7.5.2/bin/logstash-plugin.bat", "e:/logstash-7.5.2/bin/logstash.bat", "e:/logstash-7.5.2/bin/logstash.lib.sh", "e:/logstash-7.5.2/bin/ncv.conf", "e:/logstash-7.5.2/bin/pqcheck", "e:/logstash-7.5.2/bin/pqrepair", "e:/logstash-7.5.2/bin/ruby", "e:/logstash-7.5.2/bin/setup.bat", "e:/logstash-7.5.2/bin/system-install"]}
[2020-02-05T07:12:41,538][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"e:/logstash-7.5.2/bin/simple.conf"}
[2020-02-05T07:12:41,586][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2020-02-05T07:12:41,596][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2020-02-05T07:12:42,061][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
INFO  Reflections took 44 ms to scan 1 urls, producing 20 keys and 40 values

second part of the log file for test 1

[2020-02-05T07:12:42,754][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdin", :type=>"input", :class=>LogStash::Inputs::Stdin}
[2020-02-05T07:12:42,949][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"line", :type=>"codec", :class=>LogStash::Codecs::Line}
[2020-02-05T07:12:42,984][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@id = "line_7c2e3a8c-d202-4364-b96d-3de5bf8aef76"
[2020-02-05T07:12:42,986][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@enable_metric = true
[2020-02-05T07:12:42,987][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@charset = "UTF-8"
[2020-02-05T07:12:42,988][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@delimiter = "\n"
[2020-02-05T07:12:43,016][DEBUG][logstash.inputs.stdin    ] config LogStash::Inputs::Stdin/@id = "fc706eef3fe437e3d59a7cf14ac21130e50a5c11c89034db40227cfa80adac9f"
[2020-02-05T07:12:43,018][DEBUG][logstash.inputs.stdin    ] config LogStash::Inputs::Stdin/@enable_metric = true
[2020-02-05T07:12:43,034][DEBUG][logstash.inputs.stdin    ] config LogStash::Inputs::Stdin/@codec = <LogStash::Codecs::Line id=>"line_7c2e3a8c-d202-4364-b96d-3de5bf8aef76", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2020-02-05T07:12:43,036][DEBUG][logstash.inputs.stdin    ] config LogStash::Inputs::Stdin/@add_field = {}
[2020-02-05T07:12:43,074][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2020-02-05T07:12:43,099][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2020-02-05T07:12:43,107][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_dc826dcc-75f5-4e0c-934b-13a2bc1761fc"
[2020-02-05T07:12:43,109][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2020-02-05T07:12:43,110][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2020-02-05T07:12:44,700][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "9e320aa8e3f85f20b151d41490adeced75686c9b5d58a9503fcba047e2575d03"
[2020-02-05T07:12:44,702][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2020-02-05T07:12:44,706][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_dc826dcc-75f5-4e0c-934b-13a2bc1761fc", enable_metric=>true, metadata=>false>
[2020-02-05T07:12:44,707][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2020-02-05T07:12:44,975][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::NoSuchMethodError", :message=>"com.google.common.collect.TreeRangeMap.asDescendingMapOfRanges()Ljava/util/Map;", :backtrace=>["com.google.googlejavaformat.java.ModifierOrderer.applyReplacements(ModifierOrderer.java:161)", "com.google.googlejavaformat.java.ModifierOrderer.reorderModifiers(ModifierOrderer.java:139)", "com.google.googlejavaformat.java.Formatter.getFormatReplacements(Formatter.java:180)", "com.google.googlejavaformat.java.Formatter.formatSource(Formatter.java:163)", "com.google.googlejavaformat.java.Formatter.formatSource(Formatter.java:149)", "org.logstash.config.ir.compiler.ComputeStepSyntaxElement.generateCode(ComputeStepSyntaxElement.java:118)", "org.logstash.config.ir.compiler.ComputeStepSyntaxElement.compile(ComputeStepSyntaxElement.java:85)", "org.logstash.config.ir.CompiledPipeline.lambda$getDatasetClass$2(CompiledPipeline.java:304)", "java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)", "org.logstash.config.ir.CompiledPipeline.getDatasetClass(CompiledPipeline.java:304)", "org.logstash.config.ir.CompiledPipeline.access$200(CompiledPipeline.java:50)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.outputDataset(CompiledPipeline.java:417)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.lambda$compile$1(CompiledPipeline.java:347)", "java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)", "java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)", "java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)", "java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)", "java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)", "java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)", 
"java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.compile(CompiledPipeline.java:348)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.<init>(CompiledPipeline.java:328)", "org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:117)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:60)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1156)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1143)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(E:/logstash-7.5.2/logstash-core/lib/logstash/java_pipeline.rb:27)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:915)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(E:/logstash-7.5.2/logstash-core/lib/logstash/pipeline_action/create.rb:36)", 
"E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(E:/logstash-7.5.2/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:326)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:77)", "org.jruby.runtime.Block.call(Block.java:129)", "org.jruby.RubyProc.call(RubyProc.java:295)", "org.jruby.RubyProc.call(RubyProc.java:274)", "org.jruby.RubyProc.call(RubyProc.java:270)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:745)"]}
warning: thread "Converge PipelineAction::Create<main>" terminated with exception (report_on_exception is true):
LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`
          create at org/logstash/execution/ConvergeResultExt.java:109
             add at org/logstash/execution/ConvergeResultExt.java:37
  converge_state at E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339
[2020-02-05T07:12:44,985][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339:in `block in converge_state'"]}
[2020-02-05T07:12:45,004][DEBUG][logstash.agent           ] Starting puma
[2020-02-05T07:12:45,017][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339:in `block in converge_state'"]}
[2020-02-05T07:12:45,019][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-02-05T07:12:45,074][DEBUG][logstash.api.service     ] [api-service] start
[2020-02-05T07:12:45,075][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

E:\logstash-7.5.2\bin>
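For what it's worth, a NoSuchMethodError on com.google.common.collect.TreeRangeMap.asDescendingMapOfRanges() often means an older Guava ended up on the classpath ahead of the one Logstash bundles, and the SLF4J "multiple bindings" warning later in these logs points at E:/tika-app-1.20.jar as one candidate. A quick check might look like this (a sketch, assuming a Unix-like shell; the jar inspection in the comments uses Windows findstr):

```shell
# Print the CLASSPATH that Logstash inherits at startup; if it names other
# application jars (the SLF4J warning shows E:/tika-app-1.20.jar being
# picked up), those jars can shadow the Guava version Logstash bundles.
echo "CLASSPATH=${CLASSPATH:-<not set>}"
# To see whether a suspect jar bundles its own copy of the missing class,
# something like this could be run (not executed here), and the result
# compared against the Guava jar under logstash-core/lib/jars/:
#   jar -tf E:\tika-app-1.20.jar | findstr TreeRangeMap
```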

Apologies for adding this, but I figured I'd set up Logstash on a different disk for the heck of it, using the same simple.conf file as before. That run did seem to complete without any issues, which at first made no sense at all; the tail of the log below, though, reports that no config files were found at that path, so it may never have compiled a pipeline in the first place.

[2020-02-05T07:29:56,953][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"c:/logstash-7.5.2/bin/simple.conf"}
[2020-02-05T07:29:56,961][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2020-02-05T07:29:56,983][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2020-02-05T07:29:57,013][DEBUG][logstash.agent           ] Starting puma
[2020-02-05T07:29:57,025][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2020-02-05T07:29:57,027][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-02-05T07:29:57,046][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2020-02-05T07:29:57,049][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2020-02-05T07:29:57,051][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2020-02-05T07:29:57,058][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>0}
[2020-02-05T07:29:57,061][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2020-02-05T07:29:57,078][DEBUG][logstash.api.service     ] [api-service] start
[2020-02-05T07:29:57,312][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-02-05T07:30:02,175][INFO ][logstash.runner          ] Logstash shut down.

Anyway, I'll go ahead with test 2 shortly.

OK, so test 2. To be honest, I'm not even sure I'm doing this right.

Running on Windows, so I'm setting the database environment variable to:

jdbc:oracle:thin:@name_of_node.net:1521/service_name
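Since the conf references the connection string as "${database}", Logstash substitutes it from an environment variable (or the keystore) at startup; the debug log later confirms this with "Replacing `${database}` with actual value". A minimal sketch of setting it before launch, with the same placeholder host and service names as above:

```shell
# POSIX shell shown for illustration; on Windows cmd the equivalent is:
#   set database=jdbc:oracle:thin:@name_of_node.net:1521/service_name
# (set it in the same console session that then runs logstash.bat)
export database="jdbc:oracle:thin:@name_of_node.net:1521/service_name"
echo "$database"
```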

ncv2.conf file content

input {
  jdbc {
    jdbc_driver_library => "E:\ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "${database}"
    jdbc_user => "USER"
    jdbc_password => "password"
    jdbc_validate_connection => true
    jdbc_paging_enabled => true
    jdbc_page_size => "100000"
    statement => "select * from person where updt_dt_tm > trunc(sysdate)"
  }
}
output {
  stdout { }
}
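Side note: stdout { } with no explicit codec falls back to rubydebug (the debug lines in the log confirm LogStash::Codecs::RubyDebug being loaded for the stdout output), so spelling it out is equivalent:

```
output {
  stdout { codec => rubydebug }
}
```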

Note that I've changed the statement so that it now queries a non-system table; I've verified in SQL Developer that it returns six rows.

I'll post the logs from running e:\logstash-7.5.2\bin\logstash --debug -f e:\logstash-7.5.2\bin\ncv2.conf in the next two posts.

first part of log

E:\logstash-7.5.2\bin>logstash --debug -f e:\logstash-7.5.2\bin\ncv2.conf
Thread.exclusive is deprecated, use Thread::Mutex
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/E:/tika-app-1.20.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/E:/logstash-7.5.2/logstash-core/lib/jars/log4j-slf4j-impl-2.11.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Sending Logstash logs to E:/logstash-7.5.2/logs which is now configured via log4j2.properties
[2020-02-05T09:11:37,510][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"E:/logstash-7.5.2/modules/fb_apache/configuration"}
[2020-02-05T09:11:37,622][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x218c6b23 @directory="E:/logstash-7.5.2/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-05T09:11:37,628][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"E:/logstash-7.5.2/modules/netflow/configuration"}
[2020-02-05T09:11:37,631][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x5b6f8b25 @directory="E:/logstash-7.5.2/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2020-02-05T09:11:37,757][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2020-02-05T09:11:37,760][DEBUG][logstash.runner          ] node.name: "RPA5874"

[2020-02-05T09:11:37,762][DEBUG][logstash.runner          ] *path.config: "e:\\logstash-7.5.2\\bin\\ncv2.conf"
[2020-02-05T09:11:37,763][DEBUG][logstash.runner          ] path.data: "E:/logstash-7.5.2/data"
[2020-02-05T09:11:37,765][DEBUG][logstash.runner          ] modules.cli: []
[2020-02-05T09:11:37,767][DEBUG][logstash.runner          ] modules: []
[2020-02-05T09:11:37,769][DEBUG][logstash.runner          ] modules_list: []
[2020-02-05T09:11:37,771][DEBUG][logstash.runner          ] modules_variable_list: []
[2020-02-05T09:11:37,774][DEBUG][logstash.runner          ] modules_setup: false

[2020-02-05T09:11:37,776][DEBUG][logstash.runner          ] config.test_and_exit: false
[2020-02-05T09:11:37,777][DEBUG][logstash.runner          ] config.reload.automatic: false
[2020-02-05T09:11:37,779][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2020-02-05T09:11:37,780][DEBUG][logstash.runner          ] config.support_escapes: false
[2020-02-05T09:11:37,782][DEBUG][logstash.runner          ] config.field_reference.parser: "STRICT"
[2020-02-05T09:11:37,783][DEBUG][logstash.runner          ] metric.collect: true

[2020-02-05T09:11:37,784][DEBUG][logstash.runner          ] pipeline.id: "main"
[2020-02-05T09:11:37,786][DEBUG][logstash.runner          ] pipeline.system: false
[2020-02-05T09:11:37,788][DEBUG][logstash.runner          ] pipeline.workers: 16

[2020-02-05T09:11:37,790][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2020-02-05T09:11:37,793][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2020-02-05T09:11:37,794][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2020-02-05T09:11:37,796][DEBUG][logstash.runner          ] pipeline.java_execution: true
[2020-02-05T09:11:37,797][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2020-02-05T09:11:37,798][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2020-02-05T09:11:37,800][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2020-02-05T09:11:37,801][DEBUG][logstash.runner          ] path.plugins: []
[2020-02-05T09:11:37,802][DEBUG][logstash.runner          ] config.debug: false
[2020-02-05T09:11:37,804][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2020-02-05T09:11:37,805][DEBUG][logstash.runner          ] version: false
[2020-02-05T09:11:37,806][DEBUG][logstash.runner          ] help: false
[2020-02-05T09:11:37,808][DEBUG][logstash.runner          ] log.format: "plain"
[2020-02-05T09:11:37,809][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2020-02-05T09:11:37,811][DEBUG][logstash.runner          ] http.port: 9600..9700
[2020-02-05T09:11:37,812][DEBUG][logstash.runner          ] http.environment: "production"
[2020-02-05T09:11:37,813][DEBUG][logstash.runner          ] queue.type: "memory"

[2020-02-05T09:11:37,815][DEBUG][logstash.runner          ] queue.drain: false
[2020-02-05T09:11:37,817][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2020-02-05T09:11:37,819][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2020-02-05T09:11:37,820][DEBUG][logstash.runner          ] queue.max_events: 0
[2020-02-05T09:11:37,821][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2020-02-05T09:11:37,822][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2020-02-05T09:11:37,823][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2020-02-05T09:11:37,825][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2020-02-05T09:11:37,826][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2020-02-05T09:11:37,828][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2020-02-05T09:11:37,829][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2020-02-05T09:11:37,830][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2020-02-05T09:11:37,831][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2020-02-05T09:11:37,832][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2020-02-05T09:11:37,834][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2020-02-05T09:11:37,835][DEBUG][logstash.runner          ] keystore.file: "E:/logstash-7.5.2/config/logstash.keystore"
[2020-02-05T09:11:37,836][DEBUG][logstash.runner          ] path.queue: "E:/logstash-7.5.2/data/queue"
[2020-02-05T09:11:37,837][DEBUG][logstash.runner          ] path.dead_letter_queue: "E:/logstash-7.5.2/data/dead_letter_queue"
[2020-02-05T09:11:37,838][DEBUG][logstash.runner          ] path.settings: "E:/logstash-7.5.2/config"
[2020-02-05T09:11:37,839][DEBUG][logstash.runner          ] path.logs: "E:/logstash-7.5.2/logs"
[2020-02-05T09:11:37,839][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2020-02-05T09:11:37,841][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2020-02-05T09:11:37,843][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2020-02-05T09:11:37,845][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2020-02-05T09:11:37,846][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2020-02-05T09:11:37,847][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-05T09:11:37,849][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2020-02-05T09:11:37,850][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2020-02-05T09:11:37,850][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2020-02-05T09:11:37,851][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2020-02-05T09:11:37,852][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2020-02-05T09:11:37,853][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2020-02-05T09:11:37,854][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2020-02-05T09:11:37,855][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2020-02-05T09:11:37,856][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2020-02-05T09:11:37,857][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2020-02-05T09:11:37,858][DEBUG][logstash.runner          ] node.uuid: ""
[2020-02-05T09:11:37,859][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2020-02-05T09:11:37,907][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-05T09:11:37,923][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.5.2"}
[2020-02-05T09:11:37,976][DEBUG][logstash.agent           ] Setting up metric collection
[2020-02-05T09:11:38,045][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T09:11:38,057][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-02-05T09:11:38,222][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T09:11:38,340][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-02-05T09:11:38,347][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-02-05T09:11:38,362][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T09:11:38,373][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-02-05T09:11:38,430][DEBUG][logstash.agent           ] Starting agent
[2020-02-05T09:11:38,490][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["e:/logstash-7.5.2/bin/benchmark.sh", "e:/logstash-7.5.2/bin/cpdump", "e:/logstash-7.5.2/bin/dependencies-report", "e:/logstash-7.5.2/bin/ingest-convert.sh", "e:/logstash-7.5.2/bin/logstash", "e:/logstash-7.5.2/bin/logstash-keystore", "e:/logstash-7.5.2/bin/logstash-keystore.bat", "e:/logstash-7.5.2/bin/logstash-plugin", "e:/logstash-7.5.2/bin/logstash-plugin.bat", "e:/logstash-7.5.2/bin/logstash.bat", "e:/logstash-7.5.2/bin/logstash.lib.sh", "e:/logstash-7.5.2/bin/pqcheck", "e:/logstash-7.5.2/bin/pqrepair", "e:/logstash-7.5.2/bin/ruby", "e:/logstash-7.5.2/bin/setup.bat", "e:/logstash-7.5.2/bin/simple.conf", "e:/logstash-7.5.2/bin/system-install"]}
[2020-02-05T09:11:38,495][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"e:/logstash-7.5.2/bin/ncv2.conf"}

second part of log

[2020-02-05T09:11:38,540][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2020-02-05T09:11:38,550][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2020-02-05T09:11:39,236][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
INFO  Reflections took 41 ms to scan 1 urls, producing 20 keys and 40 values
[2020-02-05T09:11:39,948][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"jdbc", :type=>"input", :class=>LogStash::Inputs::Jdbc}
[2020-02-05T09:11:40,040][DEBUG][logstash.inputs.jdbc     ] Replacing `${database}` with actual value
[2020-02-05T09:11:40,045][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2020-02-05T09:11:40,586][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2020-02-05T09:11:40,623][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_2a98bf94-ee1b-4e98-9d96-fcfc79951022"
[2020-02-05T09:11:40,625][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2020-02-05T09:11:40,626][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2020-02-05T09:11:40,649][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_user = "USER"
[2020-02-05T09:11:40,651][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_paging_enabled = true
[2020-02-05T09:11:40,652][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validate_connection = true
[2020-02-05T09:11:40,654][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_password = <password>
[2020-02-05T09:11:40,655][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_page_size = 100000
[2020-02-05T09:11:40,656][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@statement = "select * from person where updt_dt_tm > trunc(sysdate)"
[2020-02-05T09:11:40,657][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_library = "E:\\ojdbc6.jar"
[2020-02-05T09:11:40,658][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_connection_string = "jdbc:oracle:thin:@name_of_node.net:1521/service_name"
[2020-02-05T09:11:40,659][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@id = "13ed5c38d05ac24bbd6701df6100ed7254fe8510bc82cd3bd15c6847deda5e0d"
[2020-02-05T09:11:40,660][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_class = "Java::oracle.jdbc.driver.OracleDriver"
[2020-02-05T09:11:40,661][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@enable_metric = true
[2020-02-05T09:11:40,677][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@codec = <LogStash::Codecs::Plain id=>"plain_2a98bf94-ee1b-4e98-9d96-fcfc79951022", enable_metric=>true, charset=>"UTF-8">
[2020-02-05T09:11:40,678][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@add_field = {}
[2020-02-05T09:11:40,679][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validation_timeout = 3600
[2020-02-05T09:11:40,680][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_pool_timeout = 5
[2020-02-05T09:11:40,681][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sequel_opts = {}
[2020-02-05T09:11:40,682][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sql_log_level = "info"
[2020-02-05T09:11:40,683][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts = 1
[2020-02-05T09:11:40,684][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts_wait_time = 0.5
[2020-02-05T09:11:40,685][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@plugin_timezone = "utc"
[2020-02-05T09:11:40,686][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@parameters = {}
[2020-02-05T09:11:40,687][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@last_run_metadata_path = "C:\\Users\\isdsupml/.logstash_jdbc_last_run"
[2020-02-05T09:11:40,693][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@use_column_value = false
[2020-02-05T09:11:40,695][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@tracking_column_type = "numeric"
[2020-02-05T09:11:40,696][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@clean_run = false
[2020-02-05T09:11:40,697][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@record_last_run = true
[2020-02-05T09:11:40,697][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@lowercase_column_names = true
[2020-02-05T09:11:40,698][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@columns_charset = {}
[2020-02-05T09:11:40,700][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@use_prepared_statements = false
[2020-02-05T09:11:40,701][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@prepared_statement_name = ""
[2020-02-05T09:11:40,702][DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@prepared_statement_bind_values = []
[2020-02-05T09:11:40,741][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2020-02-05T09:11:40,764][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2020-02-05T09:11:40,772][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_14babc30-d7b0-41a2-b21e-8437af6c1425"
[2020-02-05T09:11:40,773][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2020-02-05T09:11:40,774][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2020-02-05T09:11:42,285][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "91cb37bd96ecbf10170fc9701ad31400008e6b3aa765fbe7f209e4281853a42a"
[2020-02-05T09:11:42,286][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2020-02-05T09:11:42,289][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_14babc30-d7b0-41a2-b21e-8437af6c1425", enable_metric=>true, metadata=>false>
[2020-02-05T09:11:42,290][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2020-02-05T09:11:42,553][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::NoSuchMethodError", :message=>"com.google.common.collect.TreeRangeMap.asDescendingMapOfRanges()Ljava/util/Map;", :backtrace=>["com.google.googlejavaformat.java.ModifierOrderer.applyReplacements(ModifierOrderer.java:161)", "com.google.googlejavaformat.java.ModifierOrderer.reorderModifiers(ModifierOrderer.java:139)", "com.google.googlejavaformat.java.Formatter.getFormatReplacements(Formatter.java:180)", "com.google.googlejavaformat.java.Formatter.formatSource(Formatter.java:163)", "com.google.googlejavaformat.java.Formatter.formatSource(Formatter.java:149)", "org.logstash.config.ir.compiler.ComputeStepSyntaxElement.generateCode(ComputeStepSyntaxElement.java:118)", "org.logstash.config.ir.compiler.ComputeStepSyntaxElement.compile(ComputeStepSyntaxElement.java:85)", "org.logstash.config.ir.CompiledPipeline.lambda$getDatasetClass$2(CompiledPipeline.java:304)", "java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)", "org.logstash.config.ir.CompiledPipeline.getDatasetClass(CompiledPipeline.java:304)", "org.logstash.config.ir.CompiledPipeline.access$200(CompiledPipeline.java:50)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.outputDataset(CompiledPipeline.java:417)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.lambda$compile$1(CompiledPipeline.java:347)", "java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)", "java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)", "java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)", "java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)", "java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)", "java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)", 
"java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.compile(CompiledPipeline.java:348)", "org.logstash.config.ir.CompiledPipeline$CompiledExecution.<init>(CompiledPipeline.java:328)", "org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:117)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:60)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1156)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1143)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(E:/logstash-7.5.2/logstash-core/lib/logstash/java_pipeline.rb:27)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:915)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(E:/logstash-7.5.2/logstash-core/lib/logstash/pipeline_action/create.rb:36)", 
"E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(E:/logstash-7.5.2/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)", "E_3a_.logstash_minus_7_dot_5_dot_2.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:326)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:77)", "org.jruby.runtime.Block.call(Block.java:129)", "org.jruby.RubyProc.call(RubyProc.java:295)", "org.jruby.RubyProc.call(RubyProc.java:274)", "org.jruby.RubyProc.call(RubyProc.java:270)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:745)"]}warning: thread "Converge PipelineAction::Create<main>" terminated with exception (report_on_exception is true):LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`          create at org/logstash/execution/ConvergeResultExt.java:109             add at org/logstash/execution/ConvergeResultExt.java:37  converge_state at E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339
[2020-02-05T09:11:42,562][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339:in `block in converge_state'"]}

Third part of the log:

[2020-02-05T09:11:42,582][DEBUG][logstash.agent           ] Starting puma
[2020-02-05T09:11:42,596][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "E:/logstash-7.5.2/logstash-core/lib/logstash/agent.rb:339:in `block in converge_state'"]}
[2020-02-05T09:11:42,598][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-02-05T09:11:42,649][DEBUG][logstash.api.service     ] [api-service] start
[2020-02-05T09:11:42,655][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

I found a post, logstash-6.7.0 failing to start, that suggested the content of the CLASSPATH environment variable could be at fault.

I temporarily set CLASSPATH to "" in the command-line window before running the ncv.conf file, which uses jdbc to connect to the database, and Logstash then ran fine with data returned. I have no idea why that is, or what in CLASSPATH was causing the issue, but this is what my CLASSPATH looked like before blanking it:

C:\Program Files (x86)\IBM\WebSphere MQ\Java\lib\com.ibm.mqjms.jar;C:\Program Files (x86)\IBM\WebSphere MQ\Java\lib\com.ibm.mq.jar;e:\app\ojdbc\;e:\app\ojdbc\ojdbc6_g.jar;e:\tika-app-1.20.jar;.

Hi @c95mbq - Thanks again for running these tests and for experimenting with the CLASSPATH environment variable. I need to discuss with the team how CLASSPATH is handled when Logstash runs in a Windows environment.

@c95mbq - I discussed with the team and this is a bug in the Windows batch script. I opened this Github issue. I invite you to subscribe to it for further updates. Thank you for your collaboration on this topic - I am glad we found the problem :slightly_smiling_face:

That's great, thanks heaps for all your help ropc, this was driving me mad!

Now it's back to the original problem of trying to secure logstash so I'm sure I'll be making another post shortly :wink:

