Ingest with JDBC input plugin in append mode

Hi all,

I'm using the JDBC input plugin to ingest a table from MySQL into Elasticsearch.
Now I have a problem: is it possible to ingest only the new data into Elasticsearch? I'd like to append new rows incrementally instead of re-indexing the whole table.

I thought that maybe it's possible with a WHERE clause and a timestamp, like in this example snippet:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "timestamp" => now - 1d }
    schedule => "0 23 * * *"
    statement => "SELECT * from songs where timestamp >= :timestamp"
  }
}
But I don't know how to store `now` for that parameter.

Can you help me?
Maybe there is another method that I don't know.


Take a look at the documentation for tracking_column and :sql_last_value.
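To connect that advice back to the timestamp idea in the original question: with tracking_column the plugin persists the last seen value as :sql_last_value between scheduled runs, so there is no need to compute `now` yourself. A minimal sketch, reusing the connection settings from the first snippet (the `updated_at` column name is an assumption, not from the original table):

```
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    schedule => "0 23 * * *"
    # :sql_last_value is filled in by Logstash with the highest
    # updated_at value seen in the previous run
    statement => "SELECT * FROM songs WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
  }
}
```

On the first run :sql_last_value defaults to 1970-01-01, so the whole table is ingested once; after that only newer rows come through.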


Thanks, I will check it and give you feedback.

I solved it with this configuration:

input {
  jdbc {
    jdbc_driver_library => "/opt/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46.jar"
    jdbc_connection_string => "jdbc:mysql://database:3306/otrs?zeroDateTimeBehavior=convertToNull"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * *"
    statement => "SELECT * from dummy WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    tags => ["table_dummy"]
  }
}

For very large tables I enabled paging (jdbc_paging_enabled, jdbc_page_size), then I tracked the id column (tracking_column) and used it in the WHERE clause (:sql_last_value points to the last ingested id).
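One detail worth noting for anyone reusing this: :sql_last_value is persisted to disk between runs, so the append position survives a Logstash restart. The location can be overridden with last_run_metadata_path; the path below is only an example, not from the configuration above:

```
input {
  jdbc {
    # ... same connection and statement settings as above ...
    # By default the last tracked value is stored in
    # $HOME/.logstash_jdbc_last_run; a per-pipeline path avoids
    # collisions when running several jdbc inputs on one host.
    last_run_metadata_path => "/var/lib/logstash/.jdbc_last_run_dummy"
  }
}
```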

Thanks for your help.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.