Keep DB extracts up to date


Totally new to Logstash (and ELK in general). We've set everything up following the many online guides to extract product information from our DB and be able to query it in ES. But if the info changes in our DB, we need Logstash/ES to reflect the changes as soon as possible.

We created a Logstash config as follows:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://svr-h003671.hayley-group.local:3306/masdata"
    jdbc_user => "elastic"
    jdbc_password => "xxxxxx"
    jdbc_driver_library => "/usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    # jdbc_page_size => "50000"
    jdbc_fetch_size => "50000"
    # runs at minute 5 of every hour (not every 5 minutes)
    schedule => "5 * * * *"
    statement => "SELECT item_code,item_description,brand_name FROM tbl_products p LEFT JOIN tbl_brands b ON b.brand_id = p.brand_id"
    tags => "idx-md_descriptions"
  }
}

output {
  if "idx-md_descriptions" in [tags] {
    elasticsearch {
      hosts => [""]
      index => "idx-md_descriptions"
      document_id => "%{item_code}"
    }
  }
}

What configuration change do I need to tell it to refresh the existing data (or check for changes on a 'per row' level)?
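For larger tables, one common pattern is an incremental query so each scheduled run only pulls rows that changed since the last run. A sketch, assuming (hypothetically) that tbl_products has a last_modified timestamp column, using the jdbc input's built-in :sql_last_value tracking:

```
input {
  jdbc {
    # ...connection and driver settings as above...

    schedule => "*/5 * * * *"                # every 5 minutes
    tracking_column => "last_modified"       # hypothetical column - adjust to your schema
    tracking_column_type => "timestamp"
    use_column_value => true
    last_run_metadata_path => "/usr/share/logstash/.jdbc_last_run"

    # only fetch rows changed since the previous run
    statement => "SELECT item_code, item_description, brand_name, p.last_modified
                  FROM tbl_products p
                  LEFT JOIN tbl_brands b ON b.brand_id = p.brand_id
                  WHERE p.last_modified > :sql_last_value"
  }
}
```

Logstash persists the highest seen value of the tracking column in last_run_metadata_path and substitutes it for :sql_last_value on the next run. Note this only picks up updated and inserted rows; deletes still need separate handling.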

Thank you.

Never mind, it looks like it does actually update existing records automatically, despite there being posts online discussing ways around this behaviour!
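For anyone finding this later: the overwrite works because document_id => "%{item_code}" means every scheduled run indexes each row to the same _id, so changed rows replace their old documents in place. If you ever need partial updates instead of full replacement, the elasticsearch output also supports an upsert mode; a minimal sketch:

```
output {
  elasticsearch {
    hosts => [""]
    index => "idx-md_descriptions"
    document_id => "%{item_code}"
    action => "update"        # update instead of full index
    doc_as_upsert => true     # create the document if it doesn't exist yet
  }
}
```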
