logstash-input-jdbc doesn't work on Logstash 5.x

I installed Logstash 5.x and I have a problem with the logstash-input-jdbc plugin.

My simple.conf works fine while the jdbc input is commented out and it parses logs properly, but as soon as the jdbc input is enabled Logstash stops working: the config test passes and the service runs, yet even the log parsing that worked before no longer produces anything.
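
(The config test was run with something along the lines of the command below; the paths are the standard package layout, so treat this purely as an illustration.)

  # Validate the pipeline configuration and exit (Logstash 5.x)
  /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf --config.test_and_exit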

This is my jdbc input configuration; it was working fine on Logstash 2.4:

  jdbc {
    type => "oracle"
    jdbc_driver_library => "/usr/share/logstash/vendor/jar/jdbc/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_user => "usr"
    jdbc_password => "pass"
    jdbc_connection_string => "jdbc:oracle:thin:usr/pass@//00.00.00.00:1521/xxx.xxxxx.xxx"
    jdbc_validate_connection => true
    statement => "SELECT * FROM H"
    #schedule => "36 10 * * *"
  }

The logstash-input-jdbc plugin version is 4.1.3, and the driver JAR is in place:

-rwxrw-rw- 1 logstash logstash 1,9M mar 21  2013 /usr/share/logstash/vendor/jar/jdbc/ojdbc6.jar

Can you help me? Where is the problem?

Best regards.

Have you looked for clues in the Logstash logs? Does it help if you increase the logging verbosity?
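
For example (assuming the 5.x CLI and a default package install, so adjust the paths to your setup), running Logstash in the foreground with debug logging usually shows where it stops:

  # Run in the foreground with debug-level logging (Logstash 5.x)
  /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf --log.level debug

The same level can also be set persistently with log.level: debug in logstash.yml.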

Hello, I didn't see anything in the log.
This is the verbose log output:

[2016-12-20T10:18:51,613][INFO ][logstash.inputs.lumberjack] Starting lumberjack input listener {:address=>"0.0.0.0:5043"}
[2016-12-20T10:18:53,062][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-12-20T10:18:53,206][INFO ][org.logstash.beats.Server] Starting server on port: 5044

And this is the debug output for the jdbc plugin:

[DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"jdbc", :type=>"input", :class=>LogStash::Inputs::Jdbc}
[DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_bf6587e1-6c70-473e-adcc-62d8aa7d59b9"
[DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@type = "oracle"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_library = "/usr/share/logstash/vendor/jar/jdbc/ojdbc6.jar"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_class = "Java::oracle.jdbc.driver.OracleDriver"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_user = "usr"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_password = <password>
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_connection_string = "jdbc:oracle:thin:usr/pass@//xx.xx.xx.xx:1521/ordwh"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validate_connection = true
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@statement = "SELECT * FROM H"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@id = "bb3cca8f09d32b674748386967c728ceee1897a3-3"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@enable_metric = true
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@codec = <LogStash::Codecs::Plain id=>"plain_bf6587e1-6c70-473e-adcc-62d8aa7d59b9", enable_metric=>true, charset=>"UTF-8">
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@add_field = {}
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_paging_enabled = false
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_page_size = 100000
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validation_timeout = 3600
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_pool_timeout = 5
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sequel_opts = {}
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sql_log_level = "info"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts = 1
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts_wait_time = 0.5
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@parameters = {}
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@last_run_metadata_path = "/root/.logstash_jdbc_last_run"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@use_column_value = false
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@tracking_column_type = "numeric"
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@clean_run = false
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@record_last_run = true
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@lowercase_column_names = true
[DEBUG][logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@columns_charset = {}
[DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
...
[DEBUG][logstash.agent           ] starting agent
[DEBUG][logstash.agent           ] starting pipeline {:id=>"main"}

Could you tell me anything about this problem?
Thanks

What happens after the pipeline is started?

These are the messages:

[2016-12-20T18:28:16,141][INFO ][logstash.inputs.lumberjack] Starting lumberjack input listener {:address=>"0.0.0.0:5043"}
[2016-12-20T18:28:16,650][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2016-12-20 18:28:16 +0100}
[2016-12-20T18:28:16,722][DEBUG][io.netty.util.internal.logging.InternalLoggerFactory] Using Log4J as the default logging framework
[2016-12-20T18:28:16,729][DEBUG][io.netty.util.internal.NativeLibraryLoader] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
[2016-12-20T18:28:16,729][DEBUG][io.netty.util.internal.NativeLibraryLoader] -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)
[2016-12-20T18:28:16,817][DEBUG][io.netty.util.internal.ThreadLocalRandom] -Dio.netty.initialSeedUniquifier: 0x2ee497e4adecafb9
[2016-12-20T18:28:17,158][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.Buffer.address: available
[2016-12-20T18:28:17,158][DEBUG][io.netty.util.internal.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
[2016-12-20T18:28:17,158][DEBUG][io.netty.util.internal.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
[2016-12-20T18:28:17,159][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.Bits.unaligned: true
[2016-12-20T18:28:17,159][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): available
[2016-12-20T18:28:17,161][DEBUG][io.netty.util.internal.Cleaner0] java.nio.ByteBuffer.cleaner(): available
[2016-12-20T18:28:17,161][DEBUG][io.netty.util.internal.PlatformDependent] Java version: 8
[2016-12-20T18:28:17,161][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noUnsafe: false
[2016-12-20T18:28:17,162][DEBUG][io.netty.util.internal.PlatformDependent] sun.misc.Unsafe: available
[2016-12-20T18:28:17,162][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noJavassist: false
[2016-12-20T18:28:17,262][DEBUG][io.netty.util.internal.PlatformDependent] Javassist: available
[2016-12-20T18:28:17,262][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
[2016-12-20T18:28:17,263][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
[2016-12-20T18:28:17,263][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noPreferDirect: false
[2016-12-20T18:28:17,263][DEBUG][io.netty.util.internal.PlatformDependent] io.netty.maxDirectMemory: 1038876672 bytes
[2016-12-20T18:28:17,274][DEBUG][io.netty.buffer.AbstractByteBuf] -Dio.netty.buffer.bytebuf.checkAccessible: true
[2016-12-20T18:28:17,277][DEBUG][io.netty.util.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
[2016-12-20T18:28:17,277][DEBUG][io.netty.util.ResourceLeakDetector] -Dio.netty.leakDetection.maxRecords: 4
[2016-12-20T18:28:17,279][DEBUG][io.netty.util.ResourceLeakDetectorFactory] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@1f34a2b7
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 8
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 8
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
[2016-12-20T18:28:17,297][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
[2016-12-20T18:28:17,298][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
[2016-12-20T18:28:17,298][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
[2016-12-20T18:28:17,310][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.allocator.type: pooled
[2016-12-20T18:28:17,310][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 65536
[2016-12-20T18:28:17,310][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
[2016-12-20T18:28:17,325][DEBUG][io.netty.handler.ssl.OpenSslContext] Default cipher suite (OpenSSL): [ECDHE-RSA-AES128-GCM-SHA256, ECDHE-RSA-AES128-SHA, ECDHE-RSA-AES256-SHA, AES128-GCM-SHA256, AES128-SHA, AES256-SHA, DES-CBC3-SHA]
[2016-12-20T18:28:17,326][DEBUG][io.netty.handler.ssl.OpenSslContext] OpenSslContext only support -Djdk.tls.ephemeralDHKeySize={int}, but got: matched
[2016-12-20T18:28:17,334][DEBUG][io.netty.util.Recycler   ] -Dio.netty.recycler.maxCapacity: 262144
[2016-12-20T18:28:17,334][DEBUG][io.netty.util.Recycler   ] -Dio.netty.recycler.maxSharedCapacityFactor: 2
[2016-12-20T18:28:17,334][DEBUG][io.netty.util.Recycler   ] -Dio.netty.recycler.linkCapacity: 16

Is that really the only logs you're getting?

Yes, it is.
The log does not show any error or warning and continues with the following messages:

...
[2016-12-20T18:28:17,355][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305 => ECDHE-ECDSA-CHACHA20-POLY1305
[2016-12-20T18:28:17,355][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_CHACHA20_POLY1305 => ECDHE-ECDSA-CHACHA20-POLY1305
[2016-12-20T18:28:17,356][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305 => ECDHE-RSA-CHACHA20-POLY1305
[2016-12-20T18:28:17,356][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_CHACHA20_POLY1305 => ECDHE-RSA-CHACHA20-POLY1305
[2016-12-20T18:28:17,356][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_ECDHE_PSK_CHACHA20_POLY1305 => ECDHE-PSK-CHACHA20-POLY1305
[2016-12-20T18:28:17,356][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_ECDHE_PSK_CHACHA20_POLY1305 => ECDHE-PSK-CHACHA20-POLY1305
[2016-12-20T18:28:17,356][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 => ECDHE-ECDSA-AES128-GCM-SHA256
[2016-12-20T18:28:17,356][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 => ECDHE-ECDSA-AES128-GCM-SHA256
...
[2016-12-20T18:28:17,365][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-12-20T18:28:17,373][DEBUG][io.netty.channel.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 8
[2016-12-20T18:28:17,399][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.noKeySetOptimization: false
[2016-12-20T18:28:17,399][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
[2016-12-20T18:28:17,424][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-12-20T18:28:17,436][DEBUG][io.netty.util.internal.JavassistTypeParameterMatcherGenerator] Generated: io.netty.util.internal.__matchers__.org.logstash.beats.BatchMatcher
[2016-12-20T18:28:17,463][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.processId: 36543 (auto-detected)
[2016-12-20T18:28:17,466][DEBUG][io.netty.util.NetUtil    ] Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
[2016-12-20T18:28:17,467][DEBUG][io.netty.util.NetUtil    ] /proc/sys/net/core/somaxconn: 128
[2016-12-20T18:28:17,468][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.machineId: 00:50:56:ff:fe:92:41:01 (auto-detected)
[2016-12-20T18:28:17,628][DEBUG][logstash.inputs.file     ] _globbed_files: /var/log/remotesyslog/n2bb/d_log_n2bb.aa: glob is: ["/var/log/remotesyslog/n2bb/d_log_n2bb.aa"]
[2016-12-20T18:28:17,628][DEBUG][logstash.inputs.file     ] _discover_file: /var/log/remotesyslog/n2bb/d_log_n2bb.aa: new: /var/log/remotesyslog/n2bb/d_log_n2bb.aa (exclude is [])
[2016-12-20T18:28:17,629][DEBUG][logstash.inputs.file     ] _globbed_files: /var/log/remotesyslog/pruebatxt.log: glob is: ["/var/log/remotesyslog/pruebatxt.log"]
[2016-12-20T18:28:17,630][DEBUG][logstash.inputs.file     ] _discover_file: /var/log/remotesyslog/pruebatxt.log: new: /var/log/remotesyslog/pruebatxt.log (exclude is [])
[2016-12-20T18:28:17,630][DEBUG][logstash.inputs.file     ] _open_file: /var/log/remotesyslog/n2bb/d_log_n2bb.aa: opening
[2016-12-20T18:28:17,630][DEBUG][logstash.inputs.file     ] _open_file: /var/log/remotesyslog/pruebatxt.log: opening
[2016-12-20T18:28:17,630][DEBUG][logstash.inputs.file     ] /var/log/remotesyslog/n2bb/d_log_n2bb.aa: sincedb last value 71308, cur size 0
[2016-12-20T18:28:17,630][DEBUG][logstash.inputs.file     ] /var/log/remotesyslog/n2bb/d_log_n2bb.aa: last value size is greater than current value, starting over
[2016-12-20T18:28:17,630][DEBUG][logstash.inputs.file     ] /var/log/remotesyslog/pruebatxt.log: initial create, no sincedb, seeking to beginning of file
[2016-12-20T18:28:17,631][DEBUG][logstash.inputs.file     ] Received line {:path=>"/var/log/remotesyslog/pruebatxt.log", :text=>"CAMPO1;CAMPO2;CAMPO3;CAMPO4;CAMPO5"}
[2016-12-20T18:28:17,662][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2016-12-20 18:28:17 +0100}
...

I don't know where the problem is...

"That's the only logs I get" and "the logs continue with the following messages" are contradictory statements. Please provide the full logs (use a gist or pastebin) if you want any further help.

Ok, sorry.
This is the complete log: https://drive.google.com/open?id=0B_5I2i8gwlyDemxFVEY3dWQzVmc

I can't paste it here in full because it is too large.

Thanks!

Here's the error at the end of the log (newlines inserted for readability):

[2016-12-20T18:32:32,736][ERROR][logstash.agent           ] Pipeline aborted due to error 
{:exception=>#<Sequel::DatabaseConnectionError: Java::JavaSql::SQLException: The Network Adapter could not establish the connection>,
:backtrace=>["oracle.jdbc.driver.SQLStateMapping.newSQLException(oracle/jdbc/driver/SQLStateMapping.java:70)", 
"oracle.jdbc.driver.DatabaseError.newSQLException(oracle/jdbc/driver/DatabaseError.java:133)", 
"oracle.jdbc.driver.DatabaseError.throwSqlException(oracle/jdbc/driver/DatabaseError.java:199)", 
"oracle.jdbc.driver.DatabaseError.throwSqlException(oracle/jdbc/driver/DatabaseError.java:480)", 
"oracle.jdbc.driver.T4CConnection.logon(oracle/jdbc/driver/T4CConnection.java:413)", 
"oracle.jdbc.driver.PhysicalConnection.<init>(oracle/jdbc/driver/PhysicalConnection.java:508)", 
"oracle.jdbc.driver.T4CConnection.<init>(oracle/jdbc/driver/T4CConnection.java:203)", 
"oracle.jdbc.driver.T4CDriverExtension.getConnection(oracle/jdbc/driver/T4CDriverExtension.java:33)",
"oracle.jdbc.driver.OracleDriver.connect(oracle/jdbc/driver/OracleDriver.java:510)", 
...

I've never used JDBC together with Oracle so I have no idea what "The Network Adapter could not establish the connection" could mean.
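
As a generic hint, though: judging from the message alone, it looks like the Oracle driver could not open a TCP connection to the listener at all (wrong host/port, listener down, or a firewall in between). A quick reachability check from the Logstash host, reusing the masked address from your config, might look like:

  # From the Logstash machine, check that the Oracle listener port answers
  # (host/port copied from the masked connection string; use the real values)
  nc -vz 00.00.00.00 1521
  # or, if nc is not installed:
  telnet 00.00.00.00 1521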

OK, thank you very much.
I'll take it from here (sorry for the delay, I was on vacation).

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.