JDBC Oracle table to Elastic through logstash

Hi,

I am trying to get Oracle table data into Elasticsearch through Logstash (version 6.2, on Windows). I have created a pipeline, added the pipeline ID to the logstash.yml file, and restarted the Logstash service. The pipeline starts and I can see my SQL query as an INFO entry in the logs, but then the pipeline terminates. The index gets created in Elasticsearch, but there is no data in it. I am not sure what the issue is.

input
 {
   jdbc
    {     
      jdbc_driver_library => "D:\elasticsearch\jars\ojdbc7.jar"
      jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"     
      jdbc_connection_string => "jdbc:oracle:thin:@<servername>:1521:<sid>"      
      jdbc_user => "userid"
      jdbc_password => "*****"
      statement => "select * from <schema.tablename>"      
    }
 }
 filter
 {
 }
 output
  {
    elasticsearch 
     {
      hosts => "https://<elasticserver>:9200/"
      index => "tempindex-%{+YYYY.MM}"
      document_type => "temp"
      user => "<user>"
      password => "*****"
      ssl => true
      ssl_certificate_verification => true
      cacert => "<.pem path>"
    }
  }
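
Note: as far as I understand, the jdbc input runs the statement only once and then finishes when no schedule is set, which would match the "Pipeline has terminated" line in the logs below (though not the empty index). For periodic runs a schedule could be added to the input; a rough sketch, with the every-minute cron expression only as an example:

   jdbc
    {
      # ... same driver / connection / user / statement settings as above ...
      # optional: re-run the query on a cron-like schedule (here: every minute);
      # without a schedule the statement is executed once and the input closes
      schedule => "* * * * *"
    }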

Logs
[2020-10-26T11:50:36,372][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"oracledata", :thread=>"#<Thread:0x66e32381 sleep>"}
[2020-10-26T11:50:36,373][INFO ][logstash.agent           ] Pipelines running {:count=>2, :pipelines=>["otherpipline", "oracledata"]}
[2020-10-26T11:50:36,817][INFO ][logstash.inputs.jdbc     ] (0.053214s) select * from <schema.table>
[2020-10-26T11:50:43,361][INFO ][logstash.pipeline        ] Pipeline has terminated {:pipeline_id=>"oracledata", :thread=>"#<Thread:0x66e32381 run>"}

Hi,

Maybe it is a mapping conflict? Have you already checked whether the dead letter queue is enabled? Maybe the data is being sent there.
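
The dead letter queue is off by default and is switched on in logstash.yml; a minimal sketch, with the path only as an example:

# logstash.yml
dead_letter_queue.enable: true
# optional: where the DLQ segments are written (example path)
path.dead_letter_queue: "D:/elasticsearch/logstash/dlq"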

Best regards
Wolfram

Can you run this on the command line to check whether anything is displayed on standard output?
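
For example, with a stdout output added to the pipeline and Logstash started in the foreground, any rows coming back from the query are printed to the console; a rough sketch (the config file path and the separate data path are only examples):

output
 {
   stdout { codec => rubydebug }   # print every event to the console
 }

bin\logstash.bat -f D:\elasticsearch\config\oracledata.conf --path.data D:\temp\logstash-test

(The separate --path.data avoids clashing with the Logstash instance that is already running as a service.)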
