Loading Oracle into Elastic with Logstash

I'm trying to load Oracle data into Elasticsearch with Logstash.
The query returns many rows, but only one record ends up in Elasticsearch.
Is there anything wrong with my config file?

input {
  jdbc {
    jdbc_driver_library => "C:\jdbc\lib\ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@hostname:port:SID"
    jdbc_user => "my_user"
    jdbc_password => "my_password"
    statement => "SELECT * FROM MY_TABLE"
  }
}
output {
  elasticsearch {
    index => "my_index"
    document_id => "%{table_id}"
    hosts => "hostname:9200"
  }
}

Try setting a tracking column too:

input {
  jdbc {
    statement => "SELECT * FROM MY_TABLE"
    use_column_value => true
    tracking_column_type => "numeric"
    tracking_column => "some_primary_key_or_sequence"
    last_run_metadata_path => "/elastic/tmp/testing/confs/test-jdbc-int-sql_last_value.yml"
    # ... other configuration bits
  }
}
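Note that the tracking column only changes what gets queried if the statement actually references :sql_last_value; otherwise every run re-selects the whole table. A minimal sketch of how the two fit together (some_primary_key_or_sequence is a placeholder for a real, monotonically increasing column in MY_TABLE):

input {
  jdbc {
    # Only fetch rows added since the last run; Logstash substitutes
    # the stored tracking value for :sql_last_value on each execution.
    statement => "SELECT * FROM MY_TABLE WHERE some_primary_key_or_sequence > :sql_last_value"
    use_column_value => true
    tracking_column_type => "numeric"
    tracking_column => "some_primary_key_or_sequence"
    last_run_metadata_path => "/elastic/tmp/testing/confs/test-jdbc-int-sql_last_value.yml"
  }
}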

Still only one record was loaded.
But thanks for your reply.

The document_id option sets the Elasticsearch _id. If every document has the same [table_id] field, each row will keep overwriting the same document, and you will end up with only one document in Elasticsearch.
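A sketch of the corrected output, assuming MY_TABLE has a column that is unique per row (unique_row_id here is a placeholder for your table's actual primary key). Alternatively, omit document_id entirely and Elasticsearch will generate a unique _id for each document:

output {
  elasticsearch {
    hosts => "hostname:9200"
    index => "my_index"
    # Use a genuinely unique per-row value so each row becomes its own document.
    document_id => "%{unique_row_id}"
  }
}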

You get the point!
Thanks. It works well now.
