My JDBC input config is not working; Logstash only logs [org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.

I cannot manage to get any data into my Elasticsearch instance.

This is what my jdbc input looks like: 🙂

 jdbc {
        type => "B"
        jdbc_connection_string => "jdbc:oracle:thin:@//fff.ff.ff.f:1521/fff"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
        jdbc_user => "fff"
        jdbc_password => "ffff"
        statement  => "select update_date from i_log order by insert_date desc"
        use_column_value => true
        tracking_column => "update_date"
        jdbc_paging_enabled => "true"
        jdbc_page_size => "50000"
        tracking_column_type => "numeric"
        schedule => "0 * * * *"
        clean_run => true
        last_run_metadata_path => "/data/application/.logstash_jdbc_last_run"
 }
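For reference, a tracking column only does anything if the statement actually references the saved state via :sql_last_value; a minimal sketch of that query shape, using the table and columns from the config above (whether update_date is really numeric is an assumption):

    statement => "SELECT * FROM i_log WHERE update_date > :sql_last_value ORDER BY update_date ASC"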

I strongly suspect the Java configuration, but I don't know how to fix this.

This is the output of java -version:

-bash-4.2$ java -version
openjdk version "1.8.0_222"
OpenJDK Runtime Environment (Zulu) (build 1.8.0_222-b10)
OpenJDK 64-Bit Server VM (Zulu) (build 25.222-b10, mixed mode)

and the JAVA_HOME command gives:



What is the output you get when you run this config?

You have not configured the jdbc_driver_library parameter. I don't know if it's mandatory, but you might want to check that.
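If the Oracle driver jar is not already on Logstash's classpath, jdbc_driver_library is how you point the plugin at it; a minimal sketch (the jar path is an assumption, adjust it to wherever your ojdbc jar actually lives):

    jdbc {
        # path to the Oracle JDBC driver jar (assumed location)
        jdbc_driver_library => "/path/to/ojdbc8.jar"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
        # ... rest of your existing jdbc settings ...
    }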

Hope this helps


Thanks for your reply.

I was able to solve the issue; it was the OpenJDK version I was using.
OpenJDK 1.8.0_222 does not work with Logstash 7.5.
I downloaded version 11 instead and it works now.
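For anyone hitting the same problem: switching Logstash to the new JDK is mostly a matter of pointing JAVA_HOME at it before starting Logstash; a minimal sketch (the install path is an assumption, adjust it to your system):

    # Point Logstash at the JDK 11 install (path varies by distribution)
    export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
    export PATH="$JAVA_HOME/bin:$PATH"
    java -version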

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.