Greetings,
I'm trying to create a test pipeline that takes data from MySQL and dumps it into Elasticsearch, and so far so good: I'm finally able to get data into the DB. Now I'd like Logstash to run every minute, but only update the index whenever a new product is added to the DB. How would I do something like that in the statement?
input {
  jdbc {
    type => "jdbc-demo"
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.44-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/product_catalog"
    jdbc_user => "root"
    jdbc_password => "1234!"
    schedule => "* * * * *"
    statement => "SELECT name, model, manufacturer FROM products"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "localhost:9200"
    index => "product_pipeline_import"
    document_id => "%{name}"
  }
}
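From what I've read in the jdbc input docs, the sql_last_value / tracking_column options seem to be the direction I need. Here's a rough sketch of the input I've been considering; it assumes a hypothetical auto-increment id column that my table doesn't have yet (see the table below):

input {
  jdbc {
    type => "jdbc-demo"
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.44-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/product_catalog"
    jdbc_user => "root"
    jdbc_password => "1234!"
    schedule => "* * * * *"
    # Remember the highest id seen so each scheduled run only pulls new rows;
    # :sql_last_value is filled in by the plugin from the last run.
    use_column_value => true
    tracking_column => "id"            # hypothetical auto-increment column
    tracking_column_type => "numeric"
    statement => "SELECT id, name, model, manufacturer FROM products WHERE id > :sql_last_value"
  }
}

As I understand it, the plugin persists that value in last_run_metadata_path between runs, and setting clean_run => true would reset it and re-import everything.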
Here's the test table I'm testing with; nothing fancy:
mysql> desc products;
+---------------+--------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+---------------+--------------+------+-----+---------+-------+
| sku | varchar(255) | YES | | NULL | |
| name | varchar(255) | YES | | NULL | |
| type | varchar(255) | YES | | NULL | |
| price | varchar(255) | YES | | NULL | |
| upc | varchar(255) | YES | | NULL | |
| category_id | varchar(255) | YES | | NULL | |
| category_name | varchar(255) | YES | | NULL | |
| shipping | varchar(255) | YES | | NULL | |
| description | varchar(255) | YES | | NULL | |
| manufacturer | varchar(255) | YES | | NULL | |
| model | varchar(255) | YES | | NULL | |
| url | varchar(255) | YES | | NULL | |
| image | varchar(255) | YES | | NULL | |
+---------------+--------------+------+-----+---------+-------+
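If it helps, the only schema change I think the tracking approach above would need is a hypothetical auto-increment key, something like:

-- not applied yet; adds a surrogate key the jdbc input could track
ALTER TABLE products
  ADD COLUMN id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;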